Feb 24 02:54:34 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 24 02:54:34 crc restorecon[4674]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 24 02:54:34 crc restorecon[4674]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc 
restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc 
restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 
02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 
02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:34 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc 
restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc 
restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 
crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc 
restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc 
restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc 
restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc 
restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc 
restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 24 02:54:35 crc restorecon[4674]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 24 02:54:37 crc kubenswrapper[4923]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 24 02:54:37 crc kubenswrapper[4923]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 24 02:54:37 crc kubenswrapper[4923]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 24 02:54:37 crc kubenswrapper[4923]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 24 02:54:37 crc kubenswrapper[4923]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 24 02:54:37 crc kubenswrapper[4923]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.193033 4923 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198194 4923 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198224 4923 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198228 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198232 4923 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198237 4923 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198240 4923 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198244 4923 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198249 4923 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198253 4923 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 
02:54:37.198258 4923 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198263 4923 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198268 4923 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198272 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198277 4923 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198280 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198284 4923 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198288 4923 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198306 4923 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198310 4923 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198314 4923 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198318 4923 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198321 4923 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198326 4923 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 
02:54:37.198329 4923 feature_gate.go:330] unrecognized feature gate: Example Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198333 4923 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198338 4923 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198342 4923 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198347 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198350 4923 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198355 4923 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198359 4923 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198363 4923 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198366 4923 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198370 4923 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198378 4923 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198383 4923 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198386 4923 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198390 4923 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198394 4923 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198397 4923 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198401 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198405 4923 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198409 4923 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198412 4923 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198416 4923 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198420 4923 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198423 4923 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198427 4923 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198432 4923 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198436 4923 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198440 4923 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198444 4923 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198448 4923 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198452 4923 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198456 4923 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198459 4923 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198463 4923 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198467 4923 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198470 4923 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198474 4923 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198477 4923 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198481 4923 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198484 4923 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198487 4923 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198491 4923 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198495 4923 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198498 4923 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198503 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198506 4923 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198510 4923 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.198515 4923 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199687 4923 flags.go:64] FLAG: --address="0.0.0.0"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199705 4923 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199714 4923 flags.go:64] FLAG: --anonymous-auth="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199722 4923 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199730 4923 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199735 4923 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199742 4923 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199749 4923 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199755 4923 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199762 4923 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199768 4923 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199775 4923 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199781 4923 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199786 4923 flags.go:64] FLAG: --cgroup-root=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199791 4923 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199796 4923 flags.go:64] FLAG: --client-ca-file=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199801 4923 flags.go:64] FLAG: --cloud-config=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199806 4923 flags.go:64] FLAG: --cloud-provider=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199811 4923 flags.go:64] FLAG: --cluster-dns="[]"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199818 4923 flags.go:64] FLAG: --cluster-domain=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199823 4923 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199829 4923 flags.go:64] FLAG: --config-dir=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199834 4923 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199839 4923 flags.go:64] FLAG: --container-log-max-files="5"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199846 4923 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199851 4923 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199857 4923 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199862 4923 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199868 4923 flags.go:64] FLAG: --contention-profiling="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199873 4923 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199878 4923 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199884 4923 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199889 4923 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199896 4923 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199901 4923 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199906 4923 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199911 4923 flags.go:64] FLAG: --enable-load-reader="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199917 4923 flags.go:64] FLAG: --enable-server="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199922 4923 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199929 4923 flags.go:64] FLAG: --event-burst="100"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199934 4923 flags.go:64] FLAG: --event-qps="50"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199940 4923 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199945 4923 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199951 4923 flags.go:64] FLAG: --eviction-hard=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199958 4923 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199964 4923 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199969 4923 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199976 4923 flags.go:64] FLAG: --eviction-soft=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199982 4923 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199988 4923 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199993 4923 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.199999 4923 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200003 4923 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200009 4923 flags.go:64] FLAG: --fail-swap-on="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200014 4923 flags.go:64] FLAG: --feature-gates=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200021 4923 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200027 4923 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200032 4923 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200038 4923 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200044 4923 flags.go:64] FLAG: --healthz-port="10248"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200049 4923 flags.go:64] FLAG: --help="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200054 4923 flags.go:64] FLAG: --hostname-override=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200059 4923 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200064 4923 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200070 4923 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200075 4923 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200080 4923 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200085 4923 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200092 4923 flags.go:64] FLAG: --image-service-endpoint=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200097 4923 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200102 4923 flags.go:64] FLAG: --kube-api-burst="100"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200108 4923 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200114 4923 flags.go:64] FLAG: --kube-api-qps="50"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200119 4923 flags.go:64] FLAG: --kube-reserved=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200125 4923 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200131 4923 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200136 4923 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200141 4923 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200146 4923 flags.go:64] FLAG: --lock-file=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200151 4923 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200157 4923 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200162 4923 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200177 4923 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200184 4923 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200189 4923 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200194 4923 flags.go:64] FLAG: --logging-format="text"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200199 4923 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200206 4923 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200211 4923 flags.go:64] FLAG: --manifest-url=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200217 4923 flags.go:64] FLAG: --manifest-url-header=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200224 4923 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200230 4923 flags.go:64] FLAG: --max-open-files="1000000"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200236 4923 flags.go:64] FLAG: --max-pods="110"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200242 4923 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200248 4923 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200254 4923 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200259 4923 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200265 4923 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200270 4923 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200275 4923 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200308 4923 flags.go:64] FLAG: --node-status-max-images="50"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200314 4923 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200319 4923 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200325 4923 flags.go:64] FLAG: --pod-cidr=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200330 4923 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200340 4923 flags.go:64] FLAG: --pod-manifest-path=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200346 4923 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200351 4923 flags.go:64] FLAG: --pods-per-core="0"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200356 4923 flags.go:64] FLAG: --port="10250"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200361 4923 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200366 4923 flags.go:64] FLAG: --provider-id=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200371 4923 flags.go:64] FLAG: --qos-reserved=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200376 4923 flags.go:64] FLAG: --read-only-port="10255"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200381 4923 flags.go:64] FLAG: --register-node="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200386 4923 flags.go:64] FLAG: --register-schedulable="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200391 4923 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200401 4923 flags.go:64] FLAG: --registry-burst="10"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200406 4923 flags.go:64] FLAG: --registry-qps="5"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200412 4923 flags.go:64] FLAG: --reserved-cpus=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200419 4923 flags.go:64] FLAG: --reserved-memory=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200426 4923 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200431 4923 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200437 4923 flags.go:64] FLAG: --rotate-certificates="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200442 4923 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200448 4923 flags.go:64] FLAG: --runonce="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200453 4923 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200459 4923 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200465 4923 flags.go:64] FLAG: --seccomp-default="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200470 4923 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200476 4923 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200482 4923 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200487 4923 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200493 4923 flags.go:64] FLAG: --storage-driver-password="root"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200498 4923 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200503 4923 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200509 4923 flags.go:64] FLAG: --storage-driver-user="root"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200515 4923 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200520 4923 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200526 4923 flags.go:64] FLAG: --system-cgroups=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200532 4923 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200541 4923 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200546 4923 flags.go:64] FLAG: --tls-cert-file=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200552 4923 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200560 4923 flags.go:64] FLAG: --tls-min-version=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200566 4923 flags.go:64] FLAG: --tls-private-key-file=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200571 4923 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200576 4923 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200581 4923 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200587 4923 flags.go:64] FLAG: --v="2"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200594 4923 flags.go:64] FLAG: --version="false"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200600 4923 flags.go:64] FLAG: --vmodule=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200607 4923 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.200613 4923 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200740 4923 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200747 4923 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200754 4923 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200759 4923 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200765 4923 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200770 4923 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200775 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200780 4923 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200785 4923 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200789 4923 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200794 4923 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200799 4923 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200805 4923 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200811 4923 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200817 4923 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200822 4923 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200827 4923 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200833 4923 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200839 4923 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200843 4923 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200847 4923 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200851 4923 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200856 4923 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200860 4923 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200864 4923 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200868 4923 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200872 4923 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200876 4923 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200880 4923 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200884 4923 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200888 4923 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200893 4923 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200898 4923 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200903 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200908 4923 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200914 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200918 4923 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200925 4923 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200932 4923 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200966 4923 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200972 4923 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200977 4923 feature_gate.go:330] unrecognized feature gate: Example
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200982 4923 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200988 4923 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200993 4923 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.200998 4923 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201004 4923 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201009 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201014 4923 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201019 4923 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201023 4923 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201028 4923 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201032 4923 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201037 4923 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201042 4923 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201048 4923 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201054 4923 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201059 4923 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201063 4923 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201071 4923 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201075 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201079 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201083 4923 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201088 4923 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201092 4923 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201096 4923 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201100 4923 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201104 4923 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201108 4923 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201112 4923 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.201115 4923 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.202012 4923 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.210115 4923 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.210143 4923 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210214 4923 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210221 4923 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210226 4923 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210233 4923 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210237 4923 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210242 4923 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210246 4923 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210250 4923 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210255 4923 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210259 4923 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210263 4923 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210267 4923 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210271 4923 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210275 4923 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210279 4923 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210283 4923 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210286 4923 feature_gate.go:330] unrecognized feature gate: Example Feb 24 02:54:37 crc 
kubenswrapper[4923]: W0224 02:54:37.210290 4923 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210305 4923 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210309 4923 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210313 4923 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210316 4923 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210320 4923 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210323 4923 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210327 4923 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210331 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210335 4923 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210338 4923 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210342 4923 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210346 4923 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210349 4923 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210353 
4923 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210357 4923 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210361 4923 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210366 4923 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210369 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210373 4923 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210376 4923 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210380 4923 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210383 4923 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210387 4923 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210390 4923 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210394 4923 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210397 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210401 4923 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210405 4923 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 24 02:54:37 crc 
kubenswrapper[4923]: W0224 02:54:37.210408 4923 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210412 4923 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210415 4923 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210419 4923 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210422 4923 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210426 4923 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210430 4923 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210434 4923 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210438 4923 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210443 4923 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210447 4923 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210451 4923 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210455 4923 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210459 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210464 4923 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210468 4923 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210474 4923 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210480 4923 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210485 4923 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210490 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210495 4923 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210500 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210504 4923 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210509 4923 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210515 4923 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.210523 4923 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210668 4923 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210676 4923 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 
02:54:37.210680 4923 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210684 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210688 4923 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210692 4923 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210695 4923 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210699 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210704 4923 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210707 4923 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210711 4923 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210715 4923 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210718 4923 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210722 4923 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210726 4923 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210730 4923 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210733 4923 feature_gate.go:330] unrecognized 
feature gate: AWSEFSDriverVolumeMetrics Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210737 4923 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210740 4923 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210744 4923 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210747 4923 feature_gate.go:330] unrecognized feature gate: Example Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210751 4923 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210754 4923 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210758 4923 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210762 4923 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210765 4923 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210769 4923 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210773 4923 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210777 4923 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210782 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210787 4923 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210790 4923 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210794 4923 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210797 4923 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210801 4923 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210805 4923 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210809 4923 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210812 4923 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210816 4923 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210821 4923 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210826 4923 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210830 4923 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210834 4923 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210838 4923 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210843 4923 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210847 4923 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210851 4923 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210854 4923 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210858 4923 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210861 4923 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210865 4923 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210869 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210872 4923 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210876 4923 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 
02:54:37.210880 4923 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210883 4923 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210887 4923 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210890 4923 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210894 4923 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210897 4923 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210901 4923 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210904 4923 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210908 4923 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210912 4923 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210915 4923 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210918 4923 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210922 4923 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210925 4923 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210929 4923 feature_gate.go:330] unrecognized feature gate: 
GCPClusterHostedDNS Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210933 4923 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.210938 4923 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.210944 4923 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.211134 4923 server.go:940] "Client rotation is on, will bootstrap in background" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.236349 4923 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.236423 4923 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.239196 4923 server.go:997] "Starting client certificate rotation" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.239245 4923 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.242222 4923 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-26 05:52:25.830849019 +0000 UTC Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.242334 4923 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.368529 4923 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.373524 4923 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.382404 4923 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.401607 4923 log.go:25] "Validated CRI v1 runtime API" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.485173 4923 log.go:25] "Validated CRI v1 image API" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.489587 4923 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.498337 4923 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-24-02-49-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.498393 4923 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.544502 4923 manager.go:217] Machine: {Timestamp:2026-02-24 02:54:37.52392255 +0000 UTC m=+1.540993443 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6ab8a4ca-6e04-4f42-b567-ee52d071b81a BootID:5c1ad024-4141-4b85-9d41-81c58856d2b4 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 
Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f5:43:1d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f5:43:1d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:77:87:9f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:94:cf:15 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6d:26:4d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:4f:49:c3 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5e:0b:1d:b0:3f:6b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:98:2c:e8:7a:41 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.544929 4923 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.545160 4923 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.545734 4923 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.545928 4923 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.545960 4923 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":nul
l,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.546183 4923 topology_manager.go:138] "Creating topology manager with none policy" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.546196 4923 container_manager_linux.go:303] "Creating device plugin manager" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.546820 4923 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.546853 4923 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.547079 4923 state_mem.go:36] "Initialized new in-memory state store" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.547215 4923 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.573247 4923 kubelet.go:418] "Attempting to sync node with API server" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.573293 4923 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.573395 4923 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.573426 4923 kubelet.go:324] "Adding apiserver pod source" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.573453 4923 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.579982 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.580272 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.581205 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.581490 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.596954 4923 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.598268 4923 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.601031 4923 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.604924 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.604951 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.604959 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.604967 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.604978 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.604985 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.604992 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.605004 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.605013 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.605020 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.605031 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.605037 4923 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.606420 4923 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.606880 4923 server.go:1280] "Started kubelet" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.607036 4923 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 24 02:54:37 crc systemd[1]: Started Kubernetes Kubelet. Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.611401 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.612358 4923 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.612853 4923 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.613062 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.613098 4923 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.613287 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 05:03:56.496839509 +0000 UTC Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.613341 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.613372 4923 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.613379 4923 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 
02:54:37.613464 4923 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.613887 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms" Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.614434 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.615351 4923 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.615375 4923 factory.go:55] Registering systemd factory Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.615369 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.615385 4923 factory.go:221] Registration of the systemd container factory successfully Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.615622 4923 factory.go:153] Registering CRI-O factory Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.615637 4923 factory.go:221] Registration of the crio container factory successfully Feb 24 02:54:37 crc 
kubenswrapper[4923]: I0224 02:54:37.615677 4923 factory.go:103] Registering Raw factory Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.615693 4923 manager.go:1196] Started watching for new ooms in manager Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.616677 4923 manager.go:319] Starting recovery of all containers Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.622478 4923 server.go:460] "Adding debug handlers to kubelet server" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.625898 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.625990 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.626011 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.626029 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.626048 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.626064 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.626077 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.626093 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.626112 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.635400 4923 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18970f3016ace512 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,LastTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.635691 4923 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.635783 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.635822 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.635840 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.635857 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.635918 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.635944 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636008 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636030 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636059 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636072 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636096 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636110 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636128 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636149 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636163 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636179 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" 
seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636202 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636344 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636369 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636393 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636409 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636432 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636450 
4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636465 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636484 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636499 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636515 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636534 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636549 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636572 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636588 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636604 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636624 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636671 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636694 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636711 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636725 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636745 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636759 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636777 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636792 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636806 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636826 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636853 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636874 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636898 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636917 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636938 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636966 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.636990 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637007 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637027 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637045 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637067 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637126 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637146 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637169 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637182 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637198 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637219 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637233 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637251 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637265 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637325 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637345 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637572 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637618 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637650 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637671 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637691 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637718 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637736 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637762 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637782 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637801 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637825 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637844 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637870 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637888 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637908 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.637993 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638014 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638044 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638062 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638087 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638114 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638140 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638163 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638185 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638204 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638226 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638244 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638262 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638282 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638369 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638414 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638522 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638554 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638594 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638617 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638650 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638677 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638700 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638726 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638763 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638788 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638806 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638829 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638892 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638912 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638936 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.638954 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639571 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639591 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639603 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639614 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639625 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639635 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639647 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639659 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639669 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639679 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639691 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639701 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639713 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639723 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639733 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639743 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639754 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639764 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639776 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639786 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639796 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639808 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639818 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.639828 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640100 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640155 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640166 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640178 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640189 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640199 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640209 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640220 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640230 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640241 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640278 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640288 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640312 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640321 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640331 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640341 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640350 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640377 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640388 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640590 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640607 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640619 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640631 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640642 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640651 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640661 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640672 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640683 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640692 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640704 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640731 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640741 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640750 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640760 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640771 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640781 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640791 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640802 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640812 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640822 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640832 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93"
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640843 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640854 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640865 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640876 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640887 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640899 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" 
seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640910 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640919 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640930 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640941 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640953 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.640962 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641076 4923 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641089 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641099 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641110 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641126 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641135 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641148 4923 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641157 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641167 4923 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641177 4923 reconstruct.go:97] "Volume reconstruction finished" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.641186 4923 reconciler.go:26] "Reconciler: start to sync state" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.647353 4923 manager.go:324] Recovery completed Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.655539 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.657208 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.657256 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.657269 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.658089 4923 cpu_manager.go:225] "Starting CPU manager" 
policy="none" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.658201 4923 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.658283 4923 state_mem.go:36] "Initialized new in-memory state store" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.708817 4923 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.711775 4923 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.711812 4923 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.711834 4923 kubelet.go:2335] "Starting kubelet main sync loop" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.711875 4923 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 24 02:54:37 crc kubenswrapper[4923]: W0224 02:54:37.713191 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.713246 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.713436 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.744116 
4923 policy_none.go:49] "None policy: Start" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.744929 4923 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.744957 4923 state_mem.go:35] "Initializing new in-memory state store" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.812292 4923 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.814594 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.814763 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.833865 4923 manager.go:334] "Starting Device Plugin manager" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.833944 4923 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.833966 4923 server.go:79] "Starting device plugin registration server" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.834645 4923 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.834676 4923 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.834964 4923 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.835123 4923 plugin_manager.go:116] "The desired_state_of_world 
populator (plugin watcher) starts" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.835149 4923 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.840972 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.935788 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.937717 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.937778 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.937802 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:37 crc kubenswrapper[4923]: I0224 02:54:37.937839 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:54:37 crc kubenswrapper[4923]: E0224 02:54:37.938480 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.013370 4923 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.013548 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc 
kubenswrapper[4923]: I0224 02:54:38.017673 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.017729 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.017744 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.017900 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.018120 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.018193 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.018991 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.019042 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.019061 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.019174 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.019212 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.019215 4923 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.019319 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.019338 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.019373 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.020502 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.020540 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.020553 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.020632 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.020661 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.020642 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.020724 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.020752 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.020677 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.021680 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.021714 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.021726 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.022229 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.022269 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.022285 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.022445 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.022539 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.022569 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.023315 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.023345 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.023354 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.023528 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.023562 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.023793 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.023840 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.023858 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.024230 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.024260 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc 
kubenswrapper[4923]: I0224 02:54:38.024271 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.045491 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.045563 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.045589 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.139368 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.141565 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.141646 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.141674 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 
02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.141725 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:54:38 crc kubenswrapper[4923]: E0224 02:54:38.142408 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146392 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146438 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146456 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146473 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146493 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146507 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146523 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146540 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146555 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146576 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146602 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146698 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146741 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146775 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146802 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146830 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146840 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.146877 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: E0224 02:54:38.216735 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.247874 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: 
I0224 02:54:38.248255 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.248406 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.249402 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.251858 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252008 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252024 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252124 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252151 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252186 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252360 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252399 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252440 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252509 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252530 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252581 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252599 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252630 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.252691 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.253515 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.253566 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.253613 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.253661 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc 
kubenswrapper[4923]: I0224 02:54:38.253795 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.372844 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.403214 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.415803 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.466116 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: W0224 02:54:38.469022 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:38 crc kubenswrapper[4923]: E0224 02:54:38.469215 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.476961 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 24 02:54:38 crc kubenswrapper[4923]: W0224 02:54:38.496957 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:38 crc kubenswrapper[4923]: E0224 02:54:38.497089 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.543196 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.545598 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.545679 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.545698 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.545743 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:54:38 crc kubenswrapper[4923]: E0224 02:54:38.546657 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.613192 4923 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.614203 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 17:08:51.160182612 +0000 UTC Feb 24 02:54:38 crc kubenswrapper[4923]: W0224 02:54:38.637690 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-491150ef17be2850b5429cb31ce7dc3da0123c54dd72a33b15134bdf783fcca6 WatchSource:0}: Error finding container 491150ef17be2850b5429cb31ce7dc3da0123c54dd72a33b15134bdf783fcca6: Status 404 returned error can't find the container with id 491150ef17be2850b5429cb31ce7dc3da0123c54dd72a33b15134bdf783fcca6 Feb 24 02:54:38 crc kubenswrapper[4923]: W0224 02:54:38.640100 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cb22cc7dcfbbd0559352b65233e3b03d5139c6fe1430e7519dd33ec7bbdf4b0b WatchSource:0}: Error finding container cb22cc7dcfbbd0559352b65233e3b03d5139c6fe1430e7519dd33ec7bbdf4b0b: Status 404 returned error can't find the container with id cb22cc7dcfbbd0559352b65233e3b03d5139c6fe1430e7519dd33ec7bbdf4b0b Feb 24 02:54:38 crc kubenswrapper[4923]: W0224 02:54:38.642348 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-6b9d8ba7b0fee678471ab4f8970ae043140e7330d5ee06adfbae5bf2c70ea622 WatchSource:0}: Error finding container 6b9d8ba7b0fee678471ab4f8970ae043140e7330d5ee06adfbae5bf2c70ea622: Status 404 returned error can't find the container with id 
6b9d8ba7b0fee678471ab4f8970ae043140e7330d5ee06adfbae5bf2c70ea622 Feb 24 02:54:38 crc kubenswrapper[4923]: W0224 02:54:38.645570 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5b97224bed1435a83fc8439c861bd5f2fafb9da6c4e8f18c9b552f82708ae2a2 WatchSource:0}: Error finding container 5b97224bed1435a83fc8439c861bd5f2fafb9da6c4e8f18c9b552f82708ae2a2: Status 404 returned error can't find the container with id 5b97224bed1435a83fc8439c861bd5f2fafb9da6c4e8f18c9b552f82708ae2a2 Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.720280 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"491150ef17be2850b5429cb31ce7dc3da0123c54dd72a33b15134bdf783fcca6"} Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.722577 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cb22cc7dcfbbd0559352b65233e3b03d5139c6fe1430e7519dd33ec7bbdf4b0b"} Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.725399 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6b9d8ba7b0fee678471ab4f8970ae043140e7330d5ee06adfbae5bf2c70ea622"} Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.726801 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5b97224bed1435a83fc8439c861bd5f2fafb9da6c4e8f18c9b552f82708ae2a2"} Feb 24 02:54:38 crc kubenswrapper[4923]: I0224 02:54:38.728517 4923 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b2d69d97f344033ae8798a162eaee4c32e492355a75a34cd25e7a1143ffabf30"} Feb 24 02:54:38 crc kubenswrapper[4923]: W0224 02:54:38.980160 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:38 crc kubenswrapper[4923]: E0224 02:54:38.980352 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:39 crc kubenswrapper[4923]: E0224 02:54:39.017778 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="1.6s" Feb 24 02:54:39 crc kubenswrapper[4923]: W0224 02:54:39.027288 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:39 crc kubenswrapper[4923]: E0224 02:54:39.027672 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:39 crc 
kubenswrapper[4923]: I0224 02:54:39.347589 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:39 crc kubenswrapper[4923]: I0224 02:54:39.349480 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:39 crc kubenswrapper[4923]: I0224 02:54:39.349577 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:39 crc kubenswrapper[4923]: I0224 02:54:39.349589 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:39 crc kubenswrapper[4923]: I0224 02:54:39.349626 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:54:39 crc kubenswrapper[4923]: E0224 02:54:39.350383 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Feb 24 02:54:39 crc kubenswrapper[4923]: I0224 02:54:39.456368 4923 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 02:54:39 crc kubenswrapper[4923]: E0224 02:54:39.458741 4923 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:39 crc kubenswrapper[4923]: I0224 02:54:39.613471 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 
02:54:39 crc kubenswrapper[4923]: I0224 02:54:39.614448 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 21:41:39.214153622 +0000 UTC Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.612462 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.614903 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 08:00:32.945366899 +0000 UTC Feb 24 02:54:40 crc kubenswrapper[4923]: E0224 02:54:40.619607 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="3.2s" Feb 24 02:54:40 crc kubenswrapper[4923]: W0224 02:54:40.724582 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:40 crc kubenswrapper[4923]: E0224 02:54:40.725152 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.735669 4923 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="66f786ac905ee8e1660aa8c9712af9df825a900ab829e58843a10c9f97ed5116" exitCode=0 Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.735763 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"66f786ac905ee8e1660aa8c9712af9df825a900ab829e58843a10c9f97ed5116"} Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.735853 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.737431 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.737474 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.737488 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.737515 4923 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d1a4a40d58608f8ecaecb2d046e0706ec59405109c2184d10c56a72ce87877a6" exitCode=0 Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.737607 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d1a4a40d58608f8ecaecb2d046e0706ec59405109c2184d10c56a72ce87877a6"} Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.737731 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.739057 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.739090 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.739100 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.740693 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"df468f367fa1345546fbfe1edec875c48e3fa9868dbe95b756a3984c8f1ee18f"} Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.746880 4923 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9cc3852129f9dc15ad7fb080c121501e42480f5f8789a2d356fb01933d227bdf" exitCode=0 Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.747162 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.747439 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9cc3852129f9dc15ad7fb080c121501e42480f5f8789a2d356fb01933d227bdf"} Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.748242 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.748277 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.748290 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:40 crc 
kubenswrapper[4923]: I0224 02:54:40.749070 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fdcd9cb862270aaa40cacf54fe5ab0e4e7f234fd5de4b04b6c0395e393a4df1a" exitCode=0 Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.749121 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fdcd9cb862270aaa40cacf54fe5ab0e4e7f234fd5de4b04b6c0395e393a4df1a"} Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.749160 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.750404 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.750439 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.750452 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.753251 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.754388 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.754415 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.754427 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:40 crc kubenswrapper[4923]: W0224 02:54:40.918630 4923 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:40 crc kubenswrapper[4923]: E0224 02:54:40.918742 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.951316 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.953352 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.953396 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.953410 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:40 crc kubenswrapper[4923]: I0224 02:54:40.953444 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:54:40 crc kubenswrapper[4923]: E0224 02:54:40.954157 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Feb 24 02:54:41 crc kubenswrapper[4923]: W0224 02:54:41.314800 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 
38.102.83.194:6443: connect: connection refused Feb 24 02:54:41 crc kubenswrapper[4923]: E0224 02:54:41.314902 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:41 crc kubenswrapper[4923]: W0224 02:54:41.536911 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:41 crc kubenswrapper[4923]: E0224 02:54:41.537052 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.612225 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.615062 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 18:24:05.560999784 +0000 UTC Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.753821 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d3afd1c22d06b9cf9b276c48fe92b74961d5fb547e2f6f47a34954a9b60ffe71"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.753856 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0194f6f381eef7fb97cd1b651bb4a4ad1843796bf7a9a785cf4980bf6b873366"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.753865 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2d40f7ee707e2d90cd0fb3ab208b21bdb801490dcb09e866e85f7f2675fa41c9"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.753867 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.754680 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.754705 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.754714 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.758482 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"289399f55c2f1b1d895f64d36a00d6664b31e5a87c5238eec012b73140d1c6e6"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.758521 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f3c44d8d664ba6ed57f842cc222128fb3ff294794a90e1162dbdf3395fa2b27c"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.758533 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c45fb0cfe819bb8381e78f840d5fc12778d94056d6ef3440e24bd744b82534d"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.761173 4923 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5cc5f48a8993f066f0505be843f5ac769a864ea22a5c97ca82ce853454620be5" exitCode=0 Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.761232 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5cc5f48a8993f066f0505be843f5ac769a864ea22a5c97ca82ce853454620be5"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.761345 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.764388 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1574d397691fb5eb5de35a3c686e145ecd74e92d342ea1b0b6c4ec8e72f4a2a1"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.764491 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.765318 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.765502 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.765518 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.766438 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.766457 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.766465 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.768719 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cd7139b75d4f28e104f616496b1148e4ead7f238f33e1c57be9ffa535bf16f88"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.768745 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a19f6e3c15aaacdfeca6701d355c816e40fb25114a5772a1acb3cfc4db1b8a47"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.768757 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7d137bd6a0c3b9d5a8cca6e6322d446500abc1ae8a218f2572a7cb10b0258402"} Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.768784 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.769650 4923 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.769671 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:41 crc kubenswrapper[4923]: I0224 02:54:41.769691 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.294333 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.612719 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.615934 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 08:13:21.563118967 +0000 UTC Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.620183 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:42 crc kubenswrapper[4923]: E0224 02:54:42.663083 4923 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18970f3016ace512 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC 
m=+1.623926757,LastTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.774542 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c01f8f2e84129c743beefcefad9444b0b673be195f66b553a887c7a4143496a2"} Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.774611 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e85b4aad496c83fd9337a52bfc8c683c38cffd96f8a9bd033dde2dfcb16c2ae"} Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.774665 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.776120 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.776158 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.776169 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.777663 4923 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="50e60a944be060ae291916283fe72b437e90e09be6abafa4665313bf5211a6ae" exitCode=0 Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.777767 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:42 crc kubenswrapper[4923]: 
I0224 02:54:42.777819 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.777851 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.777868 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.778771 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.778815 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.778902 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.777718 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"50e60a944be060ae291916283fe72b437e90e09be6abafa4665313bf5211a6ae"} Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.778989 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.779108 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.779143 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.779155 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:42 crc 
kubenswrapper[4923]: I0224 02:54:42.779261 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.779280 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.779307 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.779551 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.779591 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:42 crc kubenswrapper[4923]: I0224 02:54:42.779603 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.149555 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.612440 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.617028 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 19:38:39.736976624 +0000 UTC Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.671823 4923 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 02:54:43 crc kubenswrapper[4923]: E0224 02:54:43.673005 4923 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.788440 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1c9db9d2122209742559d801a4a73bf2db34d1de1d12d4061e4f7d87f13f4a36"} Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.788498 4923 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.788513 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"664cb1a376bc9ece0e3dd6101d987e3ad376ae0b263d9af8e28c177414845f7a"} Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.788557 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.788571 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.788617 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.788785 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.791657 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 
02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.791716 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.791752 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.792363 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.792433 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.792462 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.792612 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.792674 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.792700 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:43 crc kubenswrapper[4923]: E0224 02:54:43.820640 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="6.4s" Feb 24 02:54:43 crc kubenswrapper[4923]: I0224 02:54:43.917119 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.155145 4923 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.156729 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.156819 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.156840 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.156885 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:54:44 crc kubenswrapper[4923]: E0224 02:54:44.157484 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.617260 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 02:12:54.663983244 +0000 UTC Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.738033 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.796019 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5050fff3fa161a539387f524fb8907e9c9ae8ac783264226aa58f3fff40ebd40"} Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.796069 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8ddde4fececa0b1c0d8e8c8d6aa9825ac2f62f76ac4fa0c0c7fee6515bf75f4e"} Feb 24 
02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.796084 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b4e1a4a39f5d169e08e7f62275a240e66d33dca6082591411e682ab7555d71c2"} Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.796157 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.797488 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.797523 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.797536 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.798852 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.801272 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c01f8f2e84129c743beefcefad9444b0b673be195f66b553a887c7a4143496a2" exitCode=255 Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.801475 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.801480 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.801499 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c01f8f2e84129c743beefcefad9444b0b673be195f66b553a887c7a4143496a2"} Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.802885 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.802927 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.802930 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.802989 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.803005 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.802941 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:44 crc kubenswrapper[4923]: I0224 02:54:44.804214 4923 scope.go:117] "RemoveContainer" containerID="c01f8f2e84129c743beefcefad9444b0b673be195f66b553a887c7a4143496a2" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.529216 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.617908 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 16:09:27.331485054 +0000 UTC Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.807037 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.809032 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.809054 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"23983eae01516397400180b9e48c38eaf550f4aa19efeabf2d145edf812cc869"} Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.809040 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.809854 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.809880 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.809891 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.809982 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.810013 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:45 crc kubenswrapper[4923]: I0224 02:54:45.810026 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.618538 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, 
rotation deadline is 2025-11-26 19:36:44.039928693 +0000 UTC Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.702271 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.702528 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.703849 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.703882 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.703893 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.811089 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.811191 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.812493 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.812545 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:46 crc kubenswrapper[4923]: I0224 02:54:46.812559 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.247933 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 24 02:54:47 crc 
kubenswrapper[4923]: I0224 02:54:47.248099 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.249435 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.249484 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.249500 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.504244 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.618977 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 02:01:05.149264015 +0000 UTC Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.813646 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.814961 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.815044 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:47 crc kubenswrapper[4923]: I0224 02:54:47.815069 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:47 crc kubenswrapper[4923]: E0224 02:54:47.841081 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 02:54:48 crc 
kubenswrapper[4923]: I0224 02:54:48.619378 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 19:29:38.219738148 +0000 UTC Feb 24 02:54:48 crc kubenswrapper[4923]: I0224 02:54:48.816787 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:48 crc kubenswrapper[4923]: I0224 02:54:48.818397 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:48 crc kubenswrapper[4923]: I0224 02:54:48.818464 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:48 crc kubenswrapper[4923]: I0224 02:54:48.818483 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:49 crc kubenswrapper[4923]: I0224 02:54:49.620490 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 16:34:46.285724609 +0000 UTC Feb 24 02:54:49 crc kubenswrapper[4923]: I0224 02:54:49.703340 4923 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 02:54:49 crc kubenswrapper[4923]: I0224 02:54:49.703516 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 24 02:54:50 crc 
kubenswrapper[4923]: I0224 02:54:50.558115 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:50 crc kubenswrapper[4923]: I0224 02:54:50.560381 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:50 crc kubenswrapper[4923]: I0224 02:54:50.560460 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:50 crc kubenswrapper[4923]: I0224 02:54:50.560480 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:50 crc kubenswrapper[4923]: I0224 02:54:50.560523 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:54:50 crc kubenswrapper[4923]: I0224 02:54:50.620741 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 07:42:56.42221223 +0000 UTC Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.294507 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.294826 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.296657 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.296720 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.296741 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.406538 4923 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.406809 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.408619 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.408710 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.408732 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:51 crc kubenswrapper[4923]: I0224 02:54:51.621645 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 00:43:16.537959621 +0000 UTC Feb 24 02:54:52 crc kubenswrapper[4923]: I0224 02:54:52.209342 4923 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 02:54:52 crc kubenswrapper[4923]: I0224 02:54:52.622960 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 16:16:12.632290686 +0000 UTC Feb 24 02:54:53 crc kubenswrapper[4923]: I0224 02:54:53.623783 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 06:41:07.024421844 +0000 UTC Feb 24 02:54:54 crc kubenswrapper[4923]: W0224 02:54:54.233939 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 24 02:54:54 crc 
kubenswrapper[4923]: I0224 02:54:54.234090 4923 trace.go:236] Trace[1477668686]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Feb-2026 02:54:44.232) (total time: 10001ms): Feb 24 02:54:54 crc kubenswrapper[4923]: Trace[1477668686]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (02:54:54.233) Feb 24 02:54:54 crc kubenswrapper[4923]: Trace[1477668686]: [10.001634462s] [10.001634462s] END Feb 24 02:54:54 crc kubenswrapper[4923]: E0224 02:54:54.234134 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 24 02:54:54 crc kubenswrapper[4923]: I0224 02:54:54.614126 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 24 02:54:54 crc kubenswrapper[4923]: I0224 02:54:54.623972 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 18:46:21.09233706 +0000 UTC Feb 24 02:54:54 crc kubenswrapper[4923]: W0224 02:54:54.873982 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 24 02:54:54 crc kubenswrapper[4923]: I0224 02:54:54.874119 4923 trace.go:236] Trace[1374533334]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Feb-2026 02:54:44.871) (total time: 10002ms): Feb 24 02:54:54 
crc kubenswrapper[4923]: Trace[1374533334]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (02:54:54.873) Feb 24 02:54:54 crc kubenswrapper[4923]: Trace[1374533334]: [10.002286099s] [10.002286099s] END Feb 24 02:54:54 crc kubenswrapper[4923]: E0224 02:54:54.874152 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 24 02:54:55 crc kubenswrapper[4923]: I0224 02:54:55.530095 4923 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 24 02:54:55 crc kubenswrapper[4923]: I0224 02:54:55.530215 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 24 02:54:55 crc kubenswrapper[4923]: I0224 02:54:55.625614 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 05:31:06.842523473 +0000 UTC Feb 24 02:54:56 crc kubenswrapper[4923]: W0224 02:54:56.070364 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.070507 4923 trace.go:236] Trace[863688029]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Feb-2026 02:54:46.068) (total time: 10002ms): Feb 24 02:54:56 crc kubenswrapper[4923]: Trace[863688029]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (02:54:56.070) Feb 24 02:54:56 crc kubenswrapper[4923]: Trace[863688029]: [10.002108106s] [10.002108106s] END Feb 24 02:54:56 crc kubenswrapper[4923]: E0224 02:54:56.070547 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 24 02:54:56 crc kubenswrapper[4923]: E0224 02:54:56.230887 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:56Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 24 02:54:56 crc kubenswrapper[4923]: E0224 02:54:56.232569 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:56Z is after 2026-02-23T05:33:13Z" node="crc" Feb 24 02:54:56 crc kubenswrapper[4923]: E0224 02:54:56.236183 4923 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 02:54:56 crc kubenswrapper[4923]: E0224 02:54:56.238399 4923 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:56Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18970f3016ace512 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,LastTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 02:54:56 crc kubenswrapper[4923]: W0224 02:54:56.240396 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:56Z is after 2026-02-23T05:33:13Z Feb 24 02:54:56 crc kubenswrapper[4923]: E0224 02:54:56.240519 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to 
list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.245792 4923 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.245879 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.246775 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:56Z is after 2026-02-23T05:33:13Z Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.257231 4923 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]log ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]etcd ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 24 02:54:56 crc 
kubenswrapper[4923]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/generic-apiserver-start-informers ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/priority-and-fairness-filter ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/start-apiextensions-informers ok Feb 24 02:54:56 crc kubenswrapper[4923]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Feb 24 02:54:56 crc kubenswrapper[4923]: [-]poststarthook/crd-informer-synced failed: reason withheld Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/start-system-namespaces-controller ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 24 02:54:56 crc kubenswrapper[4923]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 24 02:54:56 crc kubenswrapper[4923]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: 
reason withheld Feb 24 02:54:56 crc kubenswrapper[4923]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Feb 24 02:54:56 crc kubenswrapper[4923]: [-]poststarthook/bootstrap-controller failed: reason withheld Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/start-kube-aggregator-informers ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 24 02:54:56 crc kubenswrapper[4923]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 24 02:54:56 crc kubenswrapper[4923]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]autoregister-completion ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/apiservice-openapi-controller ok Feb 24 02:54:56 crc kubenswrapper[4923]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 24 02:54:56 crc kubenswrapper[4923]: livez check failed Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.257434 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.615999 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-24T02:54:56Z is after 2026-02-23T05:33:13Z Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.627336 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 09:46:24.220000917 +0000 UTC Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.843313 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.844155 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.847006 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="23983eae01516397400180b9e48c38eaf550f4aa19efeabf2d145edf812cc869" exitCode=255 Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.847069 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"23983eae01516397400180b9e48c38eaf550f4aa19efeabf2d145edf812cc869"} Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.847155 4923 scope.go:117] "RemoveContainer" containerID="c01f8f2e84129c743beefcefad9444b0b673be195f66b553a887c7a4143496a2" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.847279 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.848353 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.848475 4923 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.848566 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:54:56 crc kubenswrapper[4923]: I0224 02:54:56.850091 4923 scope.go:117] "RemoveContainer" containerID="23983eae01516397400180b9e48c38eaf550f4aa19efeabf2d145edf812cc869" Feb 24 02:54:56 crc kubenswrapper[4923]: E0224 02:54:56.850469 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:54:57 crc kubenswrapper[4923]: I0224 02:54:57.510251 4923 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]log ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]etcd ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/generic-apiserver-start-informers ok Feb 24 02:54:57 crc 
kubenswrapper[4923]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/priority-and-fairness-filter ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-apiextensions-informers ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-apiextensions-controllers ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/crd-informer-synced ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-system-namespaces-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 24 02:54:57 crc kubenswrapper[4923]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/bootstrap-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/start-kube-aggregator-informers ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: 
[+]poststarthook/apiservice-registration-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/apiservice-discovery-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]autoregister-completion ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/apiservice-openapi-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 24 02:54:57 crc kubenswrapper[4923]: livez check failed Feb 24 02:54:57 crc kubenswrapper[4923]: I0224 02:54:57.510401 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 02:54:57 crc kubenswrapper[4923]: I0224 02:54:57.617561 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:57Z is after 2026-02-23T05:33:13Z Feb 24 02:54:57 crc kubenswrapper[4923]: I0224 02:54:57.628289 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 02:44:54.644874626 +0000 UTC Feb 24 02:54:57 crc kubenswrapper[4923]: E0224 02:54:57.841583 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 02:54:57 crc kubenswrapper[4923]: I0224 02:54:57.850926 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 24 02:54:58 crc kubenswrapper[4923]: I0224 02:54:58.619185 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:58Z is after 2026-02-23T05:33:13Z Feb 24 02:54:58 crc kubenswrapper[4923]: I0224 02:54:58.628543 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 02:34:39.360862752 +0000 UTC Feb 24 02:54:59 crc kubenswrapper[4923]: I0224 02:54:59.617785 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:54:59Z is after 2026-02-23T05:33:13Z Feb 24 02:54:59 crc kubenswrapper[4923]: I0224 02:54:59.628858 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 15:45:44.392885254 +0000 UTC Feb 24 02:54:59 crc kubenswrapper[4923]: I0224 02:54:59.703722 4923 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 02:54:59 crc kubenswrapper[4923]: I0224 02:54:59.703826 4923 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 02:55:00 crc kubenswrapper[4923]: I0224 02:55:00.616874 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:00Z is after 2026-02-23T05:33:13Z Feb 24 02:55:00 crc kubenswrapper[4923]: I0224 02:55:00.628986 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 08:19:38.022230918 +0000 UTC Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.457984 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.458276 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.459732 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.459771 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.459781 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.474113 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 24 
02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.618852 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:01Z is after 2026-02-23T05:33:13Z Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.630036 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 05:25:30.088686314 +0000 UTC Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.868890 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.870104 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.870216 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:01 crc kubenswrapper[4923]: I0224 02:55:01.870236 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.514988 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.515278 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.517353 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.517420 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.517439 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.518238 4923 scope.go:117] "RemoveContainer" containerID="23983eae01516397400180b9e48c38eaf550f4aa19efeabf2d145edf812cc869" Feb 24 02:55:02 crc kubenswrapper[4923]: E0224 02:55:02.518592 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.523618 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.617486 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:02Z is after 2026-02-23T05:33:13Z Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.631198 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 04:02:11.37203485 +0000 UTC Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.872409 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.874080 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.874132 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.874150 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:02 crc kubenswrapper[4923]: I0224 02:55:02.875163 4923 scope.go:117] "RemoveContainer" containerID="23983eae01516397400180b9e48c38eaf550f4aa19efeabf2d145edf812cc869" Feb 24 02:55:02 crc kubenswrapper[4923]: E0224 02:55:02.875543 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:55:03 crc kubenswrapper[4923]: I0224 02:55:03.232686 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:03 crc kubenswrapper[4923]: I0224 02:55:03.234185 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:03 crc kubenswrapper[4923]: I0224 02:55:03.234255 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:03 crc kubenswrapper[4923]: I0224 02:55:03.234280 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:03 crc kubenswrapper[4923]: I0224 02:55:03.234354 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:55:03 crc kubenswrapper[4923]: E0224 02:55:03.237796 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:03Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 24 02:55:03 crc kubenswrapper[4923]: E0224 02:55:03.240688 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:03Z is after 2026-02-23T05:33:13Z" node="crc" Feb 24 02:55:03 crc kubenswrapper[4923]: I0224 02:55:03.616707 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:03Z is after 2026-02-23T05:33:13Z Feb 24 02:55:03 crc kubenswrapper[4923]: I0224 02:55:03.631878 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 07:56:49.572976539 +0000 UTC Feb 24 02:55:04 crc kubenswrapper[4923]: I0224 02:55:04.618559 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:04Z is after 2026-02-23T05:33:13Z Feb 24 02:55:04 crc kubenswrapper[4923]: I0224 02:55:04.633021 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 20:49:07.4731234 +0000 UTC Feb 24 02:55:05 crc kubenswrapper[4923]: W0224 02:55:05.521903 4923 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:05Z is after 2026-02-23T05:33:13Z Feb 24 02:55:05 crc kubenswrapper[4923]: E0224 02:55:05.521981 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 02:55:05 crc kubenswrapper[4923]: I0224 02:55:05.529557 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:55:05 crc kubenswrapper[4923]: I0224 02:55:05.529683 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:05 crc kubenswrapper[4923]: I0224 02:55:05.530595 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:05 crc kubenswrapper[4923]: I0224 02:55:05.530622 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:05 crc kubenswrapper[4923]: I0224 02:55:05.530631 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:05 crc kubenswrapper[4923]: I0224 02:55:05.531062 4923 scope.go:117] "RemoveContainer" containerID="23983eae01516397400180b9e48c38eaf550f4aa19efeabf2d145edf812cc869" Feb 24 02:55:05 crc kubenswrapper[4923]: E0224 02:55:05.531206 4923 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:55:05 crc kubenswrapper[4923]: I0224 02:55:05.614941 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:05Z is after 2026-02-23T05:33:13Z Feb 24 02:55:05 crc kubenswrapper[4923]: I0224 02:55:05.634032 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 08:35:10.964029294 +0000 UTC Feb 24 02:55:05 crc kubenswrapper[4923]: W0224 02:55:05.701923 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:05Z is after 2026-02-23T05:33:13Z Feb 24 02:55:05 crc kubenswrapper[4923]: E0224 02:55:05.702111 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 02:55:06 crc kubenswrapper[4923]: E0224 
02:55:06.244629 4923 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:06Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18970f3016ace512 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,LastTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 02:55:06 crc kubenswrapper[4923]: W0224 02:55:06.595963 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:06Z is after 2026-02-23T05:33:13Z Feb 24 02:55:06 crc kubenswrapper[4923]: E0224 02:55:06.596116 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 02:55:06 crc kubenswrapper[4923]: I0224 02:55:06.618075 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:06Z is after 2026-02-23T05:33:13Z Feb 24 02:55:06 crc kubenswrapper[4923]: I0224 02:55:06.634137 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 22:31:12.180669735 +0000 UTC Feb 24 02:55:07 crc kubenswrapper[4923]: I0224 02:55:07.617281 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:07Z is after 2026-02-23T05:33:13Z Feb 24 02:55:07 crc kubenswrapper[4923]: I0224 02:55:07.634512 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 11:10:36.697329089 +0000 UTC Feb 24 02:55:07 crc kubenswrapper[4923]: E0224 02:55:07.841764 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 02:55:07 crc kubenswrapper[4923]: W0224 02:55:07.987684 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:07Z is after 2026-02-23T05:33:13Z Feb 24 02:55:07 crc kubenswrapper[4923]: E0224 02:55:07.987806 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 02:55:08 crc kubenswrapper[4923]: I0224 02:55:08.617195 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:08Z is after 2026-02-23T05:33:13Z Feb 24 02:55:08 crc kubenswrapper[4923]: I0224 02:55:08.635670 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 12:06:27.158949513 +0000 UTC Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.616054 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:09Z is after 2026-02-23T05:33:13Z Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.636450 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 11:54:52.397096645 +0000 UTC Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.703185 4923 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 
02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.703264 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.703426 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.703656 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.705369 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.705408 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.705423 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.705991 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"7d137bd6a0c3b9d5a8cca6e6322d446500abc1ae8a218f2572a7cb10b0258402"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.706191 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://7d137bd6a0c3b9d5a8cca6e6322d446500abc1ae8a218f2572a7cb10b0258402" gracePeriod=30 Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.900606 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.901390 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7d137bd6a0c3b9d5a8cca6e6322d446500abc1ae8a218f2572a7cb10b0258402"} Feb 24 02:55:09 crc kubenswrapper[4923]: I0224 02:55:09.901441 4923 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7d137bd6a0c3b9d5a8cca6e6322d446500abc1ae8a218f2572a7cb10b0258402" exitCode=255 Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.241723 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.244058 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.244118 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.244133 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.244161 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:55:10 crc kubenswrapper[4923]: E0224 02:55:10.246653 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:10Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 24 02:55:10 crc kubenswrapper[4923]: E0224 02:55:10.250851 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:10Z is after 2026-02-23T05:33:13Z" node="crc" Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.618213 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:10Z is after 2026-02-23T05:33:13Z Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.637029 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 16:18:21.041565776 +0000 UTC Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.907379 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.907951 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d119205e5e39d6310003728f67f682a6da3eceac55283fd442da055497d26bb7"} Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.908189 4923 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.909459 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.909504 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:10 crc kubenswrapper[4923]: I0224 02:55:10.909522 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:11 crc kubenswrapper[4923]: I0224 02:55:11.616122 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:11Z is after 2026-02-23T05:33:13Z Feb 24 02:55:11 crc kubenswrapper[4923]: I0224 02:55:11.638078 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 02:07:28.484610923 +0000 UTC Feb 24 02:55:11 crc kubenswrapper[4923]: I0224 02:55:11.911001 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:11 crc kubenswrapper[4923]: I0224 02:55:11.912131 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:11 crc kubenswrapper[4923]: I0224 02:55:11.912204 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:11 crc kubenswrapper[4923]: I0224 02:55:11.912235 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:12 crc kubenswrapper[4923]: I0224 02:55:12.285495 4923 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 02:55:12 crc kubenswrapper[4923]: E0224 02:55:12.291007 4923 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 02:55:12 crc kubenswrapper[4923]: E0224 02:55:12.292186 4923 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Feb 24 02:55:12 crc kubenswrapper[4923]: I0224 02:55:12.614955 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:12Z is after 2026-02-23T05:33:13Z Feb 24 02:55:12 crc kubenswrapper[4923]: I0224 02:55:12.621269 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:55:12 crc kubenswrapper[4923]: I0224 02:55:12.638358 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 21:11:46.459193668 +0000 UTC Feb 24 02:55:12 crc kubenswrapper[4923]: I0224 02:55:12.913347 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:12 crc kubenswrapper[4923]: I0224 02:55:12.914474 4923 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:12 crc kubenswrapper[4923]: I0224 02:55:12.914650 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:12 crc kubenswrapper[4923]: I0224 02:55:12.914772 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:13 crc kubenswrapper[4923]: I0224 02:55:13.616054 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:13Z is after 2026-02-23T05:33:13Z Feb 24 02:55:13 crc kubenswrapper[4923]: I0224 02:55:13.639383 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 06:54:58.596676846 +0000 UTC Feb 24 02:55:14 crc kubenswrapper[4923]: I0224 02:55:14.615443 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:14Z is after 2026-02-23T05:33:13Z Feb 24 02:55:14 crc kubenswrapper[4923]: I0224 02:55:14.640630 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 17:01:50.006355788 +0000 UTC Feb 24 02:55:15 crc kubenswrapper[4923]: I0224 02:55:15.616386 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-24T02:55:15Z is after 2026-02-23T05:33:13Z Feb 24 02:55:15 crc kubenswrapper[4923]: I0224 02:55:15.641166 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 12:02:14.368566694 +0000 UTC Feb 24 02:55:16 crc kubenswrapper[4923]: E0224 02:55:16.250837 4923 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:16Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18970f3016ace512 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,LastTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.617414 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:16Z is after 2026-02-23T05:33:13Z Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.641575 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:57:12.446002576 +0000 UTC Feb 24 02:55:16 crc kubenswrapper[4923]: 
I0224 02:55:16.703149 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.703613 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.704840 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.704899 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.704920 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.712594 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.713503 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.713601 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.713686 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.714273 4923 scope.go:117] "RemoveContainer" containerID="23983eae01516397400180b9e48c38eaf550f4aa19efeabf2d145edf812cc869" Feb 24 02:55:16 crc kubenswrapper[4923]: I0224 02:55:16.926838 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 
02:55:17.251195 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:17 crc kubenswrapper[4923]: E0224 02:55:17.253106 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:17Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.253348 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.253413 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.253440 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.253479 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:55:17 crc kubenswrapper[4923]: E0224 02:55:17.258943 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:17Z is after 2026-02-23T05:33:13Z" node="crc" Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.616286 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:17Z is after 2026-02-23T05:33:13Z Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 
02:55:17.642570 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 10:38:18.79673732 +0000 UTC
Feb 24 02:55:17 crc kubenswrapper[4923]: E0224 02:55:17.842669 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.933966 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.936445 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.941007 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="224098d1ce006fab873c1fa15bbccab0a02b2912364a665a23708454cf850c63" exitCode=255
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.941072 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"224098d1ce006fab873c1fa15bbccab0a02b2912364a665a23708454cf850c63"}
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.941126 4923 scope.go:117] "RemoveContainer" containerID="23983eae01516397400180b9e48c38eaf550f4aa19efeabf2d145edf812cc869"
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.941389 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.942685 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.942728 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.942747 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:17 crc kubenswrapper[4923]: I0224 02:55:17.943566 4923 scope.go:117] "RemoveContainer" containerID="224098d1ce006fab873c1fa15bbccab0a02b2912364a665a23708454cf850c63"
Feb 24 02:55:17 crc kubenswrapper[4923]: E0224 02:55:17.943857 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 02:55:18 crc kubenswrapper[4923]: I0224 02:55:18.614633 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:18Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:18 crc kubenswrapper[4923]: I0224 02:55:18.642737 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 09:38:12.83933987 +0000 UTC
Feb 24 02:55:18 crc kubenswrapper[4923]: I0224 02:55:18.945817 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 24 02:55:19 crc kubenswrapper[4923]: I0224 02:55:19.616168 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:19Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:19 crc kubenswrapper[4923]: I0224 02:55:19.643391 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 03:59:59.158568913 +0000 UTC
Feb 24 02:55:19 crc kubenswrapper[4923]: I0224 02:55:19.703910 4923 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 02:55:19 crc kubenswrapper[4923]: I0224 02:55:19.703997 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 02:55:20 crc kubenswrapper[4923]: I0224 02:55:20.615254 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:20Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:20 crc kubenswrapper[4923]: I0224 02:55:20.644416 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 02:07:30.171777304 +0000 UTC
Feb 24 02:55:20 crc kubenswrapper[4923]: W0224 02:55:20.652617 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:20Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:20 crc kubenswrapper[4923]: E0224 02:55:20.652689 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 02:55:21 crc kubenswrapper[4923]: I0224 02:55:21.614440 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:21Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:21 crc kubenswrapper[4923]: I0224 02:55:21.644983 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 06:06:17.559712842 +0000 UTC
Feb 24 02:55:22 crc kubenswrapper[4923]: I0224 02:55:22.617258 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:22Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:22 crc kubenswrapper[4923]: I0224 02:55:22.645627 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 08:23:48.280910464 +0000 UTC
Feb 24 02:55:23 crc kubenswrapper[4923]: I0224 02:55:23.616092 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:23Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:23 crc kubenswrapper[4923]: I0224 02:55:23.646423 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 16:38:24.139320669 +0000 UTC
Feb 24 02:55:24 crc kubenswrapper[4923]: E0224 02:55:24.256493 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:24Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.647196 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 02:14:31.815542638 +0000 UTC
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.824074 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.824276 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.824332 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.825374 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.825405 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.825413 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.825438 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.825604 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.825693 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.825772 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.826251 4923 scope.go:117] "RemoveContainer" containerID="224098d1ce006fab873c1fa15bbccab0a02b2912364a665a23708454cf850c63"
Feb 24 02:55:24 crc kubenswrapper[4923]: E0224 02:55:24.826486 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 02:55:24 crc kubenswrapper[4923]: I0224 02:55:24.828417 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:24Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:24 crc kubenswrapper[4923]: E0224 02:55:24.831988 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:24Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 24 02:55:24 crc kubenswrapper[4923]: W0224 02:55:24.867660 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:24Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:24 crc kubenswrapper[4923]: E0224 02:55:24.867729 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 02:55:25 crc kubenswrapper[4923]: I0224 02:55:25.529511 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 02:55:25 crc kubenswrapper[4923]: I0224 02:55:25.529708 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:25 crc kubenswrapper[4923]: I0224 02:55:25.531358 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:25 crc kubenswrapper[4923]: I0224 02:55:25.531407 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:25 crc kubenswrapper[4923]: I0224 02:55:25.531431 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:25 crc kubenswrapper[4923]: I0224 02:55:25.532144 4923 scope.go:117] "RemoveContainer" containerID="224098d1ce006fab873c1fa15bbccab0a02b2912364a665a23708454cf850c63"
Feb 24 02:55:25 crc kubenswrapper[4923]: E0224 02:55:25.532527 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 02:55:25 crc kubenswrapper[4923]: I0224 02:55:25.617113 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:25Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:25 crc kubenswrapper[4923]: I0224 02:55:25.648292 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 16:40:13.417677767 +0000 UTC
Feb 24 02:55:26 crc kubenswrapper[4923]: W0224 02:55:26.186718 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:26Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:26 crc kubenswrapper[4923]: E0224 02:55:26.186832 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 02:55:26 crc kubenswrapper[4923]: E0224 02:55:26.256466 4923 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:26Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18970f3016ace512 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,LastTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 02:55:26 crc kubenswrapper[4923]: I0224 02:55:26.616436 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:26Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:26 crc kubenswrapper[4923]: I0224 02:55:26.648869 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 12:51:59.594993135 +0000 UTC
Feb 24 02:55:27 crc kubenswrapper[4923]: I0224 02:55:27.614126 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:27Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:27 crc kubenswrapper[4923]: I0224 02:55:27.649393 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 23:25:58.665248709 +0000 UTC
Feb 24 02:55:27 crc kubenswrapper[4923]: E0224 02:55:27.843893 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 02:55:28 crc kubenswrapper[4923]: I0224 02:55:28.616483 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:28Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:28 crc kubenswrapper[4923]: I0224 02:55:28.649841 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 15:11:36.009267206 +0000 UTC
Feb 24 02:55:29 crc kubenswrapper[4923]: I0224 02:55:29.615685 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:29Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:29 crc kubenswrapper[4923]: I0224 02:55:29.650887 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 14:32:18.456275435 +0000 UTC
Feb 24 02:55:29 crc kubenswrapper[4923]: I0224 02:55:29.703471 4923 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 02:55:29 crc kubenswrapper[4923]: I0224 02:55:29.703569 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 02:55:30 crc kubenswrapper[4923]: W0224 02:55:30.068567 4923 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:30Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:30 crc kubenswrapper[4923]: E0224 02:55:30.068666 4923 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 02:55:30 crc kubenswrapper[4923]: I0224 02:55:30.617327 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:30Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:30 crc kubenswrapper[4923]: I0224 02:55:30.651038 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 05:13:14.157133268 +0000 UTC
Feb 24 02:55:31 crc kubenswrapper[4923]: E0224 02:55:31.262189 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:31Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.562343 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.562489 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.564374 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.564442 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.564464 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.616961 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:31Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.651231 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 07:23:43.406356538 +0000 UTC
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.832926 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.834915 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.835018 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.835046 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:31 crc kubenswrapper[4923]: I0224 02:55:31.835111 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 02:55:31 crc kubenswrapper[4923]: E0224 02:55:31.840640 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:31Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 24 02:55:32 crc kubenswrapper[4923]: I0224 02:55:32.617566 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:32Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:32 crc kubenswrapper[4923]: I0224 02:55:32.652255 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 18:52:40.448725916 +0000 UTC
Feb 24 02:55:33 crc kubenswrapper[4923]: I0224 02:55:33.615831 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:33Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:33 crc kubenswrapper[4923]: I0224 02:55:33.653141 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 07:37:38.315978026 +0000 UTC
Feb 24 02:55:34 crc kubenswrapper[4923]: I0224 02:55:34.617383 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:34Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:34 crc kubenswrapper[4923]: I0224 02:55:34.654069 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 17:39:07.594650001 +0000 UTC
Feb 24 02:55:35 crc kubenswrapper[4923]: I0224 02:55:35.617041 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:35Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:35 crc kubenswrapper[4923]: I0224 02:55:35.655285 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 15:01:28.551306467 +0000 UTC
Feb 24 02:55:36 crc kubenswrapper[4923]: E0224 02:55:36.262375 4923 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:36Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18970f3016ace512 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,LastTimestamp:2026-02-24 02:54:37.606855954 +0000 UTC m=+1.623926757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 02:55:36 crc kubenswrapper[4923]: I0224 02:55:36.616853 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:36Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:36 crc kubenswrapper[4923]: I0224 02:55:36.655802 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 15:04:58.528283663 +0000 UTC
Feb 24 02:55:36 crc kubenswrapper[4923]: I0224 02:55:36.713159 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:36 crc kubenswrapper[4923]: I0224 02:55:36.714715 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:36 crc kubenswrapper[4923]: I0224 02:55:36.714779 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:36 crc kubenswrapper[4923]: I0224 02:55:36.714792 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:36 crc kubenswrapper[4923]: I0224 02:55:36.715559 4923 scope.go:117] "RemoveContainer" containerID="224098d1ce006fab873c1fa15bbccab0a02b2912364a665a23708454cf850c63"
Feb 24 02:55:36 crc kubenswrapper[4923]: E0224 02:55:36.715813 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 02:55:37 crc kubenswrapper[4923]: I0224 02:55:37.615364 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:37Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:37 crc kubenswrapper[4923]: I0224 02:55:37.656607 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 20:25:01.409133099 +0000 UTC
Feb 24 02:55:37 crc kubenswrapper[4923]: E0224 02:55:37.844386 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 02:55:38 crc kubenswrapper[4923]: E0224 02:55:38.268751 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:38Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 24 02:55:38 crc kubenswrapper[4923]: I0224 02:55:38.617655 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:38Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:38 crc kubenswrapper[4923]: I0224 02:55:38.657365 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 14:02:09.536812707 +0000 UTC
Feb 24 02:55:38 crc kubenswrapper[4923]: I0224 02:55:38.841400 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:38 crc kubenswrapper[4923]: I0224 02:55:38.843207 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:38 crc kubenswrapper[4923]: I0224 02:55:38.843298 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:38 crc kubenswrapper[4923]: I0224 02:55:38.843356 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:38 crc kubenswrapper[4923]: I0224 02:55:38.843401 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 02:55:38 crc kubenswrapper[4923]: E0224 02:55:38.847137 4923 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:38Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.615087 4923 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:55:39Z is after 2026-02-23T05:33:13Z
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.658170 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 11:08:32.751797472 +0000 UTC
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.700760 4923 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.703553 4923 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.703599 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.703655 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.703782 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.704679 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.704724 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.704734 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.705332 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d119205e5e39d6310003728f67f682a6da3eceac55283fd442da055497d26bb7"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Feb 24 02:55:39 crc kubenswrapper[4923]: I0224 02:55:39.705439 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d119205e5e39d6310003728f67f682a6da3eceac55283fd442da055497d26bb7" gracePeriod=30
Feb 24 02:55:40 crc kubenswrapper[4923]: I0224 02:55:40.009527 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Feb 24 02:55:40 crc kubenswrapper[4923]: I0224 02:55:40.010649 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 24 02:55:40 crc kubenswrapper[4923]: I0224 02:55:40.010966 4923 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d119205e5e39d6310003728f67f682a6da3eceac55283fd442da055497d26bb7" exitCode=255
Feb 24 02:55:40 crc kubenswrapper[4923]: I0224 02:55:40.011003 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d119205e5e39d6310003728f67f682a6da3eceac55283fd442da055497d26bb7"}
Feb 24 02:55:40 crc kubenswrapper[4923]: I0224 02:55:40.011037 4923 scope.go:117] "RemoveContainer" containerID="7d137bd6a0c3b9d5a8cca6e6322d446500abc1ae8a218f2572a7cb10b0258402"
Feb 24 02:55:40 crc kubenswrapper[4923]: I0224 02:55:40.658406 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 12:16:17.302706932 +0000 UTC
Feb 24 02:55:41 crc kubenswrapper[4923]: I0224 02:55:41.017033 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Feb 24 02:55:41 crc kubenswrapper[4923]: I0224 02:55:41.018573 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5e77a1986a76ca4084e827a16abd61475c150497baa25e68649bb810cd2d30f"}
Feb 24 02:55:41 crc kubenswrapper[4923]: I0224 02:55:41.018733 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:41 crc kubenswrapper[4923]: I0224 02:55:41.020478 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:41 crc kubenswrapper[4923]: I0224 02:55:41.020551 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:41 crc kubenswrapper[4923]: I0224 02:55:41.020578 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:41 crc kubenswrapper[4923]: I0224 02:55:41.658739 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 21:39:35.919982077 +0000 UTC
Feb 24 02:55:42 crc kubenswrapper[4923]: I0224 02:55:42.021374 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:42 crc kubenswrapper[4923]: I0224 02:55:42.022667 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:42 crc kubenswrapper[4923]: I0224 02:55:42.022728 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:42 crc kubenswrapper[4923]: I0224 02:55:42.022747 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:42 crc kubenswrapper[4923]: I0224 02:55:42.620598 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:55:42 crc kubenswrapper[4923]: I0224 02:55:42.659847 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 20:02:46.616247631 +0000 UTC Feb 24 02:55:43 crc kubenswrapper[4923]: I0224 02:55:43.023285 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:43 crc kubenswrapper[4923]: I0224 02:55:43.023993 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:43 crc kubenswrapper[4923]: I0224 02:55:43.024034 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:43 crc kubenswrapper[4923]: I0224 02:55:43.024042 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:43 crc kubenswrapper[4923]: I0224 02:55:43.660911 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 22:00:00.632797972 +0000 UTC Feb 24 02:55:44 crc kubenswrapper[4923]: I0224 02:55:44.293756 4923 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 02:55:44 crc kubenswrapper[4923]: I0224 02:55:44.308967 4923 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 24 02:55:44 crc kubenswrapper[4923]: I0224 02:55:44.661338 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 10:46:12.238736826 +0000 UTC Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.661878 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 12:46:22.01132467 +0000 UTC Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.848272 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.849408 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.849473 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.849495 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.849652 4923 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.858882 4923 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.859091 4923 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 24 02:55:45 crc kubenswrapper[4923]: E0224 02:55:45.859105 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.862509 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.862541 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.862574 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.862589 4923 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.862602 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:55:45Z","lastTransitionTime":"2026-02-24T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:55:45 crc kubenswrapper[4923]: E0224 02:55:45.878446 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.886686 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.886847 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.886998 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.887101 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.887180 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:55:45Z","lastTransitionTime":"2026-02-24T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:55:45 crc kubenswrapper[4923]: E0224 02:55:45.899743 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.909795 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.909838 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.909855 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.909877 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.909894 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:55:45Z","lastTransitionTime":"2026-02-24T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:55:45 crc kubenswrapper[4923]: E0224 02:55:45.920912 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.927615 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.927758 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.927833 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.927895 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:55:45 crc kubenswrapper[4923]: I0224 02:55:45.927961 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:55:45Z","lastTransitionTime":"2026-02-24T02:55:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:55:45 crc kubenswrapper[4923]: E0224 02:55:45.936923 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:55:45 crc kubenswrapper[4923]: E0224 02:55:45.937027 4923 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 02:55:45 crc kubenswrapper[4923]: E0224 02:55:45.937162 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.037342 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.138082 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.238435 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.339127 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.440349 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.540608 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.640704 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: I0224 02:55:46.662431 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 19:10:12.661503713 
+0000 UTC Feb 24 02:55:46 crc kubenswrapper[4923]: I0224 02:55:46.703061 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:55:46 crc kubenswrapper[4923]: I0224 02:55:46.703487 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:46 crc kubenswrapper[4923]: I0224 02:55:46.704747 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:46 crc kubenswrapper[4923]: I0224 02:55:46.704883 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:46 crc kubenswrapper[4923]: I0224 02:55:46.704977 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.740795 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.841888 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:46 crc kubenswrapper[4923]: E0224 02:55:46.942227 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.043164 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.143859 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.244590 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.344965 4923 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.446070 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.546182 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.646956 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: I0224 02:55:47.663109 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 13:47:27.60659263 +0000 UTC Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.747571 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.845388 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.847640 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:47 crc kubenswrapper[4923]: E0224 02:55:47.948726 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.049111 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.149866 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.250828 4923 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.351451 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.452021 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.552667 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.652879 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: I0224 02:55:48.664208 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 18:51:16.594310843 +0000 UTC Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.753198 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.853909 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:48 crc kubenswrapper[4923]: E0224 02:55:48.955020 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.056100 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.157165 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.257948 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.359068 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.459505 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.560461 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.656029 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.656166 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.657136 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.657187 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.657200 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.660892 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.665056 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 19:57:11.694746222 +0000 UTC Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.713128 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 
24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.714326 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.714365 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.714376 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:49 crc kubenswrapper[4923]: I0224 02:55:49.715220 4923 scope.go:117] "RemoveContainer" containerID="224098d1ce006fab873c1fa15bbccab0a02b2912364a665a23708454cf850c63" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.761670 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.862463 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:49 crc kubenswrapper[4923]: E0224 02:55:49.963119 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: I0224 02:55:50.043336 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 24 02:55:50 crc kubenswrapper[4923]: I0224 02:55:50.045627 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c"} Feb 24 02:55:50 crc kubenswrapper[4923]: I0224 02:55:50.045794 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:50 crc kubenswrapper[4923]: I0224 02:55:50.047052 4923 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:50 crc kubenswrapper[4923]: I0224 02:55:50.047089 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:50 crc kubenswrapper[4923]: I0224 02:55:50.047111 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.063515 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.164619 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.264779 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.365469 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.466250 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.567077 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: I0224 02:55:50.665671 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 12:24:58.896482605 +0000 UTC Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.668179 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.768284 4923 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.869372 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:50 crc kubenswrapper[4923]: E0224 02:55:50.969812 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.049800 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.050390 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.052136 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c" exitCode=255 Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.052177 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c"} Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.052219 4923 scope.go:117] "RemoveContainer" containerID="224098d1ce006fab873c1fa15bbccab0a02b2912364a665a23708454cf850c63" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.052337 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.053175 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:51 crc 
kubenswrapper[4923]: I0224 02:55:51.053210 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.053220 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.053787 4923 scope.go:117] "RemoveContainer" containerID="ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c" Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.053996 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.070832 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.171565 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.272666 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.373351 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.474479 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.488654 4923 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 24 
02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.575340 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.666092 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 23:30:12.196585693 +0000 UTC Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.675580 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.776600 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.877681 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.925894 4923 csr.go:261] certificate signing request csr-4k52x is approved, waiting to be issued Feb 24 02:55:51 crc kubenswrapper[4923]: I0224 02:55:51.937067 4923 csr.go:257] certificate signing request csr-4k52x is issued Feb 24 02:55:51 crc kubenswrapper[4923]: E0224 02:55:51.978145 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: I0224 02:55:52.056054 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 02:55:52.078429 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 02:55:52.179072 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 
02:55:52.279596 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 02:55:52.380722 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 02:55:52.481813 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 02:55:52.582541 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: I0224 02:55:52.625178 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:55:52 crc kubenswrapper[4923]: I0224 02:55:52.625324 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:52 crc kubenswrapper[4923]: I0224 02:55:52.626548 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:52 crc kubenswrapper[4923]: I0224 02:55:52.626591 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:52 crc kubenswrapper[4923]: I0224 02:55:52.626604 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:52 crc kubenswrapper[4923]: I0224 02:55:52.666848 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 11:45:46.108215479 +0000 UTC Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 02:55:52.683210 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 02:55:52.784009 
4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 02:55:52.884805 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:52 crc kubenswrapper[4923]: I0224 02:55:52.938281 4923 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 02:50:51 +0000 UTC, rotation deadline is 2027-01-06 04:17:10.418387999 +0000 UTC Feb 24 02:55:52 crc kubenswrapper[4923]: I0224 02:55:52.938326 4923 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7585h21m17.480064526s for next certificate rotation Feb 24 02:55:52 crc kubenswrapper[4923]: E0224 02:55:52.985668 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.086103 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.186495 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.287392 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.388323 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.489121 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.589888 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: I0224 02:55:53.667492 4923 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 23:25:43.76175249 +0000 UTC Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.690910 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.792063 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.893168 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:53 crc kubenswrapper[4923]: E0224 02:55:53.993944 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.094345 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.194887 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.295596 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.396420 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.497033 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.598115 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:54 crc kubenswrapper[4923]: I0224 02:55:54.667868 4923 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 15:53:28.544965107 +0000 UTC Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.699234 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:54 crc kubenswrapper[4923]: I0224 02:55:54.738971 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:55:54 crc kubenswrapper[4923]: I0224 02:55:54.739176 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:54 crc kubenswrapper[4923]: I0224 02:55:54.740269 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:54 crc kubenswrapper[4923]: I0224 02:55:54.740330 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:54 crc kubenswrapper[4923]: I0224 02:55:54.740342 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:54 crc kubenswrapper[4923]: I0224 02:55:54.741003 4923 scope.go:117] "RemoveContainer" containerID="ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c" Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.741172 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.799569 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 24 02:55:54 crc kubenswrapper[4923]: E0224 02:55:54.900672 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.001310 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.101584 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.202198 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.302644 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.403592 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.504541 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: I0224 02:55:55.529967 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:55:55 crc kubenswrapper[4923]: I0224 02:55:55.530124 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:55:55 crc kubenswrapper[4923]: I0224 02:55:55.531193 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:55 crc kubenswrapper[4923]: I0224 02:55:55.531227 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:55 crc kubenswrapper[4923]: I0224 02:55:55.531238 4923 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:55 crc kubenswrapper[4923]: I0224 02:55:55.531971 4923 scope.go:117] "RemoveContainer" containerID="ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.532156 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.605366 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: I0224 02:55:55.668193 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 10:28:19.515039609 +0000 UTC Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.705733 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.806468 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:55 crc kubenswrapper[4923]: E0224 02:55:55.907556 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.008092 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.109280 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:55:56 crc 
kubenswrapper[4923]: E0224 02:55:56.148719 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.152881 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.152930 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.152943 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.152962 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.152976 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:55:56Z","lastTransitionTime":"2026-02-24T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.168194 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.172181 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.172248 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.172421 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.172448 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.172467 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:55:56Z","lastTransitionTime":"2026-02-24T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.184341 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.188115 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.188142 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.188180 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.188228 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.188239 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:55:56Z","lastTransitionTime":"2026-02-24T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.202983 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.206796 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.206820 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.206832 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.206845 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.206856 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:55:56Z","lastTransitionTime":"2026-02-24T02:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.221133 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:55:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.221350 4923 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.221383 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.322258 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.423460 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.524409 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.624829 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:56 crc kubenswrapper[4923]: I0224 02:55:56.669003 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 06:05:38.935138134 +0000 UTC
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.725243 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.826320 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:56 crc kubenswrapper[4923]: E0224 02:55:56.926523 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found"
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.026808 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.127204 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.228260 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:57 crc kubenswrapper[4923]: I0224 02:55:57.241589 4923 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.328914 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.429799 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.530712 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.631653 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:57 crc kubenswrapper[4923]: I0224 02:55:57.669880 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 20:17:28.386335765 +0000 UTC
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.732743 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.833742 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:57 crc 
kubenswrapper[4923]: E0224 02:55:57.846056 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 02:55:57 crc kubenswrapper[4923]: E0224 02:55:57.934652 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 02:55:58.034825 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 02:55:58.135233 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 02:55:58.236122 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 02:55:58.337176 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 02:55:58.437471 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 02:55:58.538288 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 02:55:58.638451 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: I0224 02:55:58.670706 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 23:02:02.604817242 +0000 UTC
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 02:55:58.739575 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 
02:55:58.840405 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:58 crc kubenswrapper[4923]: E0224 02:55:58.941237 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.042365 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.142510 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.243548 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.344149 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.444370 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.544982 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.645138 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: I0224 02:55:59.671461 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:16:08.210319546 +0000 UTC
Feb 24 02:55:59 crc kubenswrapper[4923]: I0224 02:55:59.713099 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 02:55:59 crc kubenswrapper[4923]: I0224 02:55:59.717826 4923 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:55:59 crc kubenswrapper[4923]: I0224 02:55:59.718048 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:55:59 crc kubenswrapper[4923]: I0224 02:55:59.718075 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.746344 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.847348 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:55:59 crc kubenswrapper[4923]: E0224 02:55:59.948026 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 02:56:00.048278 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 02:56:00.148679 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 02:56:00.248780 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 02:56:00.349165 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 02:56:00.449365 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 02:56:00.550134 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 
02:56:00.650716 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: I0224 02:56:00.672454 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 22:32:41.721776746 +0000 UTC
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 02:56:00.751393 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 02:56:00.852593 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:00 crc kubenswrapper[4923]: E0224 02:56:00.952980 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.054130 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.154698 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.255766 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.355902 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.457111 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.557215 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.658157 4923 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: I0224 02:56:01.673515 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 10:53:29.3917896 +0000 UTC
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.758715 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.859674 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:01 crc kubenswrapper[4923]: E0224 02:56:01.959928 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:02 crc kubenswrapper[4923]: E0224 02:56:02.060875 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:02 crc kubenswrapper[4923]: E0224 02:56:02.161560 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:02 crc kubenswrapper[4923]: E0224 02:56:02.262456 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:02 crc kubenswrapper[4923]: E0224 02:56:02.363614 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:02 crc kubenswrapper[4923]: E0224 02:56:02.506661 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:02 crc kubenswrapper[4923]: E0224 02:56:02.608143 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:02 crc kubenswrapper[4923]: I0224 02:56:02.674501 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 10:55:34.930702421 +0000 UTC
Feb 24 02:56:02 crc kubenswrapper[4923]: E0224 02:56:02.708710 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:02 crc kubenswrapper[4923]: E0224 02:56:02.809387 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:02 crc kubenswrapper[4923]: E0224 02:56:02.910353 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.011178 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.111496 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.212241 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.313235 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.414083 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.514722 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.614903 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: I0224 02:56:03.675615 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 
2025-12-06 07:45:37.565243449 +0000 UTC
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.715583 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.815669 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:03 crc kubenswrapper[4923]: E0224 02:56:03.916810 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.017669 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.118801 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.219545 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.319913 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.420144 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.521187 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.621909 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: I0224 02:56:04.676491 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 09:01:24.099313124 +0000 UTC
Feb 24 
02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.722140 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.822406 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:04 crc kubenswrapper[4923]: E0224 02:56:04.922953 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 02:56:05.023982 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 02:56:05.124594 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 02:56:05.225311 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 02:56:05.326242 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 02:56:05.427328 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 02:56:05.528375 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 02:56:05.629213 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 02:56:05 crc kubenswrapper[4923]: I0224 02:56:05.676739 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 10:56:58.472399651 +0000 UTC
Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 
02:56:05.729604 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 02:56:05.830754 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:05 crc kubenswrapper[4923]: E0224 02:56:05.931506 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.032193 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.133017 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.233848 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.334988 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.403863 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.408785 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.408834 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.408851 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.408893 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.408912 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:06Z","lastTransitionTime":"2026-02-24T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.424503 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.429993 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.430039 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.430053 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.430076 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.430090 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:06Z","lastTransitionTime":"2026-02-24T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.444944 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.449714 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.449769 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.449788 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.449815 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.449834 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:06Z","lastTransitionTime":"2026-02-24T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.469398 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.474987 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.475072 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.475085 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.475105 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.475120 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:06Z","lastTransitionTime":"2026-02-24T02:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.489944 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.490172 4923 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.490213 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.590463 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: I0224 02:56:06.677867 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 23:58:44.570370481 +0000 UTC Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.691175 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.791637 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.891903 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:06 crc kubenswrapper[4923]: E0224 02:56:06.992399 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.092870 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.193029 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.294145 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.395035 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: I0224 02:56:07.491666 4923 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.495534 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.596238 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: I0224 02:56:07.678888 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 13:51:01.707456527 +0000 UTC Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.697068 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.798167 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.846680 4923 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.898270 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:07 crc kubenswrapper[4923]: E0224 02:56:07.999227 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:08 crc 
kubenswrapper[4923]: E0224 02:56:08.100480 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:08 crc kubenswrapper[4923]: E0224 02:56:08.200902 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:08 crc kubenswrapper[4923]: E0224 02:56:08.301823 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:08 crc kubenswrapper[4923]: E0224 02:56:08.402868 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:08 crc kubenswrapper[4923]: E0224 02:56:08.503208 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:08 crc kubenswrapper[4923]: E0224 02:56:08.603854 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:08 crc kubenswrapper[4923]: I0224 02:56:08.680050 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 09:51:14.235376457 +0000 UTC Feb 24 02:56:08 crc kubenswrapper[4923]: E0224 02:56:08.704307 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:08 crc kubenswrapper[4923]: E0224 02:56:08.804661 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:08 crc kubenswrapper[4923]: E0224 02:56:08.905458 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.005940 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.106172 4923 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.206816 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.307833 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.408059 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.508995 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.609848 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: I0224 02:56:09.680471 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 00:07:32.250516113 +0000 UTC Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.710865 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: I0224 02:56:09.712114 4923 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 02:56:09 crc kubenswrapper[4923]: I0224 02:56:09.713078 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:09 crc kubenswrapper[4923]: I0224 02:56:09.713106 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:09 crc kubenswrapper[4923]: I0224 02:56:09.713114 4923 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:09 crc kubenswrapper[4923]: I0224 02:56:09.713657 4923 scope.go:117] "RemoveContainer" containerID="ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.713807 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.811346 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:09 crc kubenswrapper[4923]: E0224 02:56:09.911747 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.012637 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.113249 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.214150 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.315122 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.415263 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.516164 4923 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.616611 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: I0224 02:56:10.681467 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 08:14:13.43650173 +0000 UTC Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.717333 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.817748 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:10 crc kubenswrapper[4923]: E0224 02:56:10.918135 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.019091 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.119795 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.220612 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.321375 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.422487 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.523172 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.623832 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:11 crc kubenswrapper[4923]: I0224 02:56:11.682226 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 19:57:57.636380552 +0000 UTC Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.724426 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.824741 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:11 crc kubenswrapper[4923]: E0224 02:56:11.925139 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 02:56:12.026374 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 02:56:12.126989 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: I0224 02:56:12.148452 4923 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 02:56:12.227127 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 02:56:12.328230 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 02:56:12.428583 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 
02:56:12.529360 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 02:56:12.629822 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: I0224 02:56:12.683106 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 08:37:18.09198845 +0000 UTC Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 02:56:12.730284 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 02:56:12.830654 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:12 crc kubenswrapper[4923]: E0224 02:56:12.931345 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.031486 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.132127 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.232572 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.332740 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.433675 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.534645 4923 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.635674 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: I0224 02:56:13.683911 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 11:32:52.911451314 +0000 UTC Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.736362 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.837213 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:13 crc kubenswrapper[4923]: E0224 02:56:13.937754 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: E0224 02:56:14.038349 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: E0224 02:56:14.138743 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: E0224 02:56:14.239535 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: E0224 02:56:14.340745 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: E0224 02:56:14.441690 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: E0224 02:56:14.542407 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: E0224 02:56:14.643337 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: I0224 02:56:14.684631 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 03:38:24.414770514 +0000 UTC Feb 24 02:56:14 crc kubenswrapper[4923]: E0224 02:56:14.744226 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: E0224 02:56:14.845132 4923 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 02:56:14 crc kubenswrapper[4923]: I0224 02:56:14.882476 4923 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 24 02:56:14 crc kubenswrapper[4923]: I0224 02:56:14.948017 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:14 crc kubenswrapper[4923]: I0224 02:56:14.948075 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:14 crc kubenswrapper[4923]: I0224 02:56:14.948094 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:14 crc kubenswrapper[4923]: I0224 02:56:14.948118 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:14 crc kubenswrapper[4923]: I0224 02:56:14.948136 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:14Z","lastTransitionTime":"2026-02-24T02:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.050661 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.050736 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.050760 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.050790 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.050828 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.153985 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.154036 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.154053 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.154076 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.154092 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.257031 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.257087 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.257100 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.257119 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.257131 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.360250 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.360283 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.360293 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.360328 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.360338 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.462612 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.462742 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.462766 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.462790 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.462807 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.566148 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.566197 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.566209 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.566227 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.566239 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.628272 4923 apiserver.go:52] "Watching apiserver" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.633070 4923 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.633279 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.633608 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.633813 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.633905 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.633884 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.633945 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.634136 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.634179 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.634242 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.634327 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.635740 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.637222 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.637588 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.637764 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.638541 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.638603 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.638748 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.641596 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.643766 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.659661 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.668936 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.668972 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.668987 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.669004 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.669017 4923 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.672293 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.684189 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.685449 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 23:34:09.486941428 +0000 UTC Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.694874 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.708723 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.714279 4923 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724529 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724595 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724644 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724677 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724707 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724735 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724763 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724796 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " 
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724824 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724853 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724882 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724912 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724942 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.724985 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725028 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725073 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725119 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725165 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725253 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725338 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725388 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725434 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725480 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725527 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725572 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725538 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725614 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725635 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725659 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725689 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725705 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725749 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725795 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725839 4923 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725883 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725934 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725978 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726022 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726069 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726115 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726183 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726233 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726265 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726335 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726378 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726379 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726411 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726579 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726607 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726635 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726659 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726680 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726703 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726735 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726758 4923 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726781 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726803 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726825 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726847 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726871 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726894 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726916 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726938 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726960 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726984 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727009 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727031 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727054 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727077 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727101 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727123 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727148 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727172 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727197 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727221 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727249 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727271 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727293 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727334 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727356 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727381 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727402 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727424 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727464 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727505 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727544 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727567 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727599 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727634 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727666 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727697 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727721 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727746 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727769 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727795 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727818 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727846 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727870 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727893 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727916 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727942 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727966 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727988 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728013 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728041 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728079 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728104 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728127 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728151 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 24 02:56:15 crc kubenswrapper[4923]: 
I0224 02:56:15.728174 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728196 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728218 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728243 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728265 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728288 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728348 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728372 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728394 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728417 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728450 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 02:56:15 crc 
kubenswrapper[4923]: I0224 02:56:15.728486 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728538 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728570 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728600 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728627 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728657 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728689 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728720 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728755 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728789 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728828 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 02:56:15 crc 
kubenswrapper[4923]: I0224 02:56:15.728860 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728894 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728929 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729064 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729108 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729150 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729189 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729263 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729324 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729361 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729396 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729431 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729473 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729513 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729549 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729584 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729620 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729669 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729707 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729740 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729780 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729820 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729858 
4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729892 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729928 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729965 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730002 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730040 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" 
(UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730090 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730132 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730168 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730204 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730239 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730274 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730336 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730376 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730411 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730445 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730477 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: 
I0224 02:56:15.730512 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730550 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730585 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730624 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730660 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730698 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730743 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730779 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730813 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730848 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730882 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 02:56:15 crc 
kubenswrapper[4923]: I0224 02:56:15.730916 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730948 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730979 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731014 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731047 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731082 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731115 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731152 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731195 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731229 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731271 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 02:56:15 crc 
kubenswrapper[4923]: I0224 02:56:15.731339 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731381 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731415 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731455 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731496 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731534 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731576 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731644 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731693 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731742 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731790 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731834 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731881 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731930 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731974 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732012 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732052 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732127 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732169 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732212 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732252 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732351 4923 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732379 4923 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732394 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.725842 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726134 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726384 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726577 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.726938 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727081 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727281 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.727437 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728539 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728695 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728751 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728811 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728942 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.728979 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729143 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729488 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729419 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.729755 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730137 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730252 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730289 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730670 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.734794 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730716 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730978 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.730971 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731134 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731744 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.731829 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732088 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732207 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732474 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732580 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732680 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.732699 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.733058 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.733262 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.733283 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.733427 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.733731 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.733855 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.733875 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.733896 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.733985 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.734100 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.734148 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.734253 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.734487 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.734825 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.735120 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.735219 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.736560 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.736444 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.736827 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.737039 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.737333 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.737370 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.737514 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.737794 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.737843 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.737877 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.738168 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.739280 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.739551 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.739587 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.739727 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.739950 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.739997 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.740590 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.740630 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.740737 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.740745 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.740987 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.741429 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.741947 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.741981 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.742422 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.742482 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.742504 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.742552 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.742825 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.742909 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.743034 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.743515 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.743626 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.743986 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.744643 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.744796 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.745297 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.745320 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.745561 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.745566 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.745830 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.745847 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.745889 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.745892 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.746001 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:16.245970108 +0000 UTC m=+100.263041031 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746088 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746196 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746214 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746337 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746511 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746544 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746711 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746777 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746846 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.746981 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.748870 4923 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.752244 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.752460 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.752563 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.752590 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.752796 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.752808 4923 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.752949 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-24 02:56:16.252909951 +0000 UTC m=+100.269980944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.753043 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.752907 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.753376 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.753466 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.753555 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.753610 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.753433 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.753903 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.753963 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.754057 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.754938 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.755046 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.755508 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.755553 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.755594 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.755772 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.755869 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.756095 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.755900 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.756473 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.756977 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.757132 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.757400 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.757595 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.757643 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.757945 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.758065 4923 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.752818 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.758163 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:16.258137299 +0000 UTC m=+100.275208262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.758243 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.759886 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.760514 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.761116 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.761554 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.761962 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.766406 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.766579 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.770344 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.771053 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.771085 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.771104 4923 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.771239 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:16.271217093 +0000 UTC m=+100.288287996 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.771370 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.772037 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.772078 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.772093 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.772113 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.772125 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.775103 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.775314 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.775449 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.775550 4923 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:15 crc kubenswrapper[4923]: E0224 02:56:15.775699 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:16.27566745 +0000 UTC m=+100.292738263 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.775734 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.775864 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.777184 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.777617 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.778035 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.778676 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.779006 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.779008 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.779184 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.779265 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.779312 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.779377 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.779428 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.779847 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.779903 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780053 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780069 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780110 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780117 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780318 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780394 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780555 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780928 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780943 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780967 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.780961 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.781000 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.781190 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.781277 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.781539 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.782177 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.782255 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.782457 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.782607 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.782746 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.782773 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.782803 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.783037 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.783091 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.785415 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.785872 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.785962 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.786220 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.786434 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.786650 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.786744 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.786784 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.786869 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.789838 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.799446 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.803980 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.805634 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.816635 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.832886 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.832966 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833079 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833183 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833392 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833459 4923 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833531 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833584 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833639 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833848 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833907 4923 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.833963 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834015 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834069 4923 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834204 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834285 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834367 4923 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834419 4923 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834479 4923 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834534 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834600 4923 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834652 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834702 4923 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834751 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834807 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834868 4923 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834919 4923 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.834971 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835021 4923 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835069 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835122 4923 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835178 4923 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835230 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835279 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835371 4923 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835428 4923 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835484 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835594 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835650 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835699 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835750 4923 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835808 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835869 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835925 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.835980 4923 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836039 4923 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836094 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836144 4923 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836203 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836253 4923 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836333 4923 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836386 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836452 4923 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836510 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836565 4923 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836614 4923 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836662 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836711 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836758 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836812 4923 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836866 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836918 4923 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName:
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.836971 4923 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837020 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837073 4923 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837134 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837188 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837243 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837342 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" 
DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837489 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837552 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837617 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837685 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837740 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837790 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837857 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837919 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" 
(UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.837970 4923 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838031 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838091 4923 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838145 4923 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838198 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838254 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838326 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838388 4923 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838443 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838496 4923 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838545 4923 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838600 4923 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838653 4923 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838706 4923 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node 
\"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838762 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838820 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838873 4923 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838937 4923 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.838992 4923 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839042 4923 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839102 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839155 4923 reconciler_common.go:293] "Volume detached for 
volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839214 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839271 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839354 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839418 4923 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839483 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839542 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839608 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839666 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839726 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839783 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839848 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839933 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.839995 4923 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840064 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on 
node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840135 4923 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840203 4923 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840265 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840343 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840398 4923 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840453 4923 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840513 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" 
DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840563 4923 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840630 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840692 4923 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840745 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840794 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840849 4923 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.840900 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc 
kubenswrapper[4923]: I0224 02:56:15.840957 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.841012 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.841067 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842151 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842173 4923 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842191 4923 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842210 4923 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842229 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842247 4923 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842284 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842336 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842356 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842376 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842397 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842435 4923 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842465 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842501 4923 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842523 4923 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842548 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842642 4923 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842667 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842684 4923 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node 
\"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842702 4923 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842734 4923 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842756 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842775 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842793 4923 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842818 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842836 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: 
I0224 02:56:15.842877 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842897 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842915 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842933 4923 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842957 4923 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842974 4923 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.842991 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843008 4923 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843024 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843042 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843057 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843084 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843123 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843152 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843178 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" 
DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843203 4923 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843240 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843265 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843290 4923 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843367 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843391 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843409 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc 
kubenswrapper[4923]: I0224 02:56:15.843427 4923 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843450 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843474 4923 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843511 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843536 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843560 4923 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843583 4923 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843599 4923 reconciler_common.go:293] "Volume detached for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843626 4923 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843644 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843663 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843688 4923 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843717 4923 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843743 4923 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843765 4923 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on 
node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843783 4923 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.843800 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.875735 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.875795 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.875861 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.875887 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.875950 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.963075 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.976727 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.983514 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.984042 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.984187 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.984209 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.984233 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:15 crc kubenswrapper[4923]: I0224 02:56:15.984253 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:15Z","lastTransitionTime":"2026-02-24T02:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: W0224 02:56:16.007372 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-3db66a1a0eba861a92bef5fd7d0617982734070b216342fe8927a8b116c84c2f WatchSource:0}: Error finding container 3db66a1a0eba861a92bef5fd7d0617982734070b216342fe8927a8b116c84c2f: Status 404 returned error can't find the container with id 3db66a1a0eba861a92bef5fd7d0617982734070b216342fe8927a8b116c84c2f Feb 24 02:56:16 crc kubenswrapper[4923]: W0224 02:56:16.010181 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-0ef5e52ebf33cea2d9feec15957d57e9ad2640c22092d619b5c4b1a2784bc2c4 WatchSource:0}: Error finding container 0ef5e52ebf33cea2d9feec15957d57e9ad2640c22092d619b5c4b1a2784bc2c4: Status 404 returned error can't find the container with id 0ef5e52ebf33cea2d9feec15957d57e9ad2640c22092d619b5c4b1a2784bc2c4 Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.086996 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.087062 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.087083 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.087110 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.087132 4923 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.126276 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b009cd7229a6cde18b24c392961ff69b467557e748844a8c9eb16734597daf8d"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.127704 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0ef5e52ebf33cea2d9feec15957d57e9ad2640c22092d619b5c4b1a2784bc2c4"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.129515 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3db66a1a0eba861a92bef5fd7d0617982734070b216342fe8927a8b116c84c2f"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.190067 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.190110 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.190120 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.190135 4923 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.190144 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.247717 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.247871 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:17.247837692 +0000 UTC m=+101.264908525 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.292402 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.292437 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.292446 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.292460 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.292470 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.349047 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.349104 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.349130 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.349157 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349382 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349406 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349421 4923 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349484 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:17.34946394 +0000 UTC m=+101.366534763 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349745 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349786 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349806 4923 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349868 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:17.34984949 +0000 UTC m=+101.366920343 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349887 4923 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349937 4923 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.349989 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:17.349972343 +0000 UTC m=+101.367043196 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.350016 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:17.350002324 +0000 UTC m=+101.367073177 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.395285 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.395370 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.395387 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.395409 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.395425 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.498652 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.498761 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.498784 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.498815 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.498836 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.526698 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.526749 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.526765 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.526785 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.526798 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.542042 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.547516 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.547566 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.547579 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.547596 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.547608 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.565178 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.569832 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.569871 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.569889 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.569910 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.569923 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.580752 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.585543 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.585615 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.585635 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.585692 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.585710 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.598050 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.604505 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.604621 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.604641 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.604673 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.604693 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.619812 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5c1ad024-4141-4b85-9d41-81c58856d2b4\\\",\\\"systemUUID\\\":\\\"6ab8a4ca-6e04-4f42-b567-ee52d071b81a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.619977 4923 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.622146 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.622185 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.622201 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.622224 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.622240 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.685858 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 13:55:14.072124423 +0000 UTC Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.712680 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:16 crc kubenswrapper[4923]: E0224 02:56:16.712973 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.725530 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.725639 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.725684 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.725765 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.725794 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.829195 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.829271 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.829288 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.829337 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.829351 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.932016 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.932086 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.932107 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.932136 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:16 crc kubenswrapper[4923]: I0224 02:56:16.932159 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:16Z","lastTransitionTime":"2026-02-24T02:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.035998 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.036080 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.036103 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.036131 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.036153 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.134137 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bd5d072222355061b24a981452a3d7c1c5a5caef8b6bf491a5cdac138c7b09bc"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.136944 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1e05ac7e421335d34fa358b273a4bb5fce00f0d9e4c6ea8f7c5b42cd4bd2f6f5"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.137009 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4d17eb2b89ff5bd37d4215666c63f8a192fcd4d112ca34ba8aadc06a2db88080"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.138423 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.138459 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.138471 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.138489 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.138505 4923 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.149198 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.161005 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.171558 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.188463 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd5d072222355061b24a981452a3d7c1c5a5caef8b6bf491a5cdac138c7b09bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T02:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.207148 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.220903 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.238386 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd5d072222355061b24a981452a3d7c1c5a5caef8b6bf491a5cdac138c7b09bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T02:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.241007 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.241055 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.241067 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.241084 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.241522 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.253349 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.258968 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.259184 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:19.259156519 +0000 UTC m=+103.276227502 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.268697 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e05ac7e421335d34fa358b273a4bb5fce00f0d9e4c6ea8f7c5b42cd4bd2f6f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T02:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d17eb2b89ff5bd37d4215666c63f8a192fcd4d112ca34ba8aadc06a2db88080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T02:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.281271 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.294685 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.309942 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.344437 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.344519 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.344531 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.344549 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.344921 4923 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.359895 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.359964 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.359999 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.360029 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360177 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360177 4923 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360335 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360358 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360375 4923 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360375 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:19.360341386 +0000 UTC m=+103.377412229 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360439 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:19.360418708 +0000 UTC m=+103.377489541 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360497 4923 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360536 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:19.36052385 +0000 UTC m=+103.377594733 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360201 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360558 4923 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.360594 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:19.360583022 +0000 UTC m=+103.377653925 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.447813 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.447868 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.447892 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.447920 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.447957 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.551120 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.551166 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.551175 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.551191 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.551201 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.653890 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.653932 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.653940 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.653954 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.653962 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.686106 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 17:28:26.550507638 +0000 UTC Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.712472 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.712623 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.712477 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:17 crc kubenswrapper[4923]: E0224 02:56:17.712818 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.715925 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.716762 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.717581 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.718342 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.718945 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.720386 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.720925 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.721461 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.722469 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.723067 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.724139 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.724941 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.725130 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.726026 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.726730 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.727982 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.728576 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.729166 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.730152 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.730760 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.731764 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.732338 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.732888 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.733743 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.734534 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.735695 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.736595 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.737662 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.738224 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.738334 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.739579 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.740136 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.740753 4923 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 24 
02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.740875 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.742862 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.743434 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.744226 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.745686 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.746317 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.747108 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.747890 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 24 
02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.749006 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.749500 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.750096 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.751102 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.752018 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.752462 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.753417 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.754036 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 24 
02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.755351 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.755925 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.755747 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.756135 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.756193 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.756218 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.756245 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.756267 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.757011 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.757536 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.758223 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.759410 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.759872 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.770018 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.783828 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e05ac7e421335d34fa358b273a4bb5fce00f0d9e4c6ea8f7c5b42cd4bd2f6f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T02:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d17eb2b89ff5bd37d4215666c63f8a192fcd4d112ca34ba8aadc06a2db88080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T02:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.799051 4923 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T02:56:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd5d072222355061b24a981452a3d7c1c5a5caef8b6bf491a5cdac138c7b09bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T02:56:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T02:56:17Z is after 2025-08-24T17:21:41Z" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.857614 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.857654 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.857666 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.857685 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.857698 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.960290 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.960352 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.960364 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.960382 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:17 crc kubenswrapper[4923]: I0224 02:56:17.960395 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:17Z","lastTransitionTime":"2026-02-24T02:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.062536 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.062572 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.062580 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.062594 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.062602 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.165141 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.165192 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.165204 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.165220 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.165231 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.269766 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.269801 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.269811 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.269824 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.269836 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.372131 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.372391 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.372462 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.372521 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.372585 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.475208 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.475254 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.475264 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.475282 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.475310 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.577745 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.577784 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.577792 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.577806 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.577817 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.680548 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.680594 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.680603 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.680618 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.680628 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.686924 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 05:44:59.474855177 +0000 UTC Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.712197 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:18 crc kubenswrapper[4923]: E0224 02:56:18.712323 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.783783 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.783843 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.783859 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.783884 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.783901 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.887355 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.887428 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.887446 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.887471 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.887490 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.989948 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.990008 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.990027 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.990049 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:18 crc kubenswrapper[4923]: I0224 02:56:18.990065 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:18Z","lastTransitionTime":"2026-02-24T02:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.091906 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.091941 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.091949 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.091961 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.091972 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:19Z","lastTransitionTime":"2026-02-24T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.143082 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"225125317d4ea6db68bf6b17dd4005a39a40450dff55a76fa95745354f4de310"} Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.194169 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.194214 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.194231 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.194253 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.194268 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:19Z","lastTransitionTime":"2026-02-24T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.276750 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.277012 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:23.276967258 +0000 UTC m=+107.294038091 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.296001 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.296042 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.296054 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.296070 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:19 crc kubenswrapper[4923]: 
I0224 02:56:19.296083 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:19Z","lastTransitionTime":"2026-02-24T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.378256 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.378347 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.378383 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.378408 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378486 4923 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378525 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378574 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378577 4923 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378590 4923 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378649 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378691 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378544 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:23.378526594 +0000 UTC m=+107.395597407 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378706 4923 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378736 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:23.378715029 +0000 UTC m=+107.395785842 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378752 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:23.37874457 +0000 UTC m=+107.395815383 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.378764 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:23.378758471 +0000 UTC m=+107.395829284 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.398163 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.398206 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.398217 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.398233 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.398247 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:19Z","lastTransitionTime":"2026-02-24T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.501362 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.501404 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.501415 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.501430 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.501441 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:19Z","lastTransitionTime":"2026-02-24T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.562776 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-rf6ng"] Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.563107 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rf6ng" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.564835 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.564891 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.565072 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.577646 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-c74d9"] Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.578024 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.579420 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qxffg"] Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.580148 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-rh26t"] Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.580402 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.580359 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.580531 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.580603 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.582266 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.583925 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.584147 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.586473 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.586534 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.586582 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.586479 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.588836 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.588886 4923 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.589323 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.602110 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-42sfg"] Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.603041 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.603535 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.603567 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.603575 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.603588 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.603599 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:19Z","lastTransitionTime":"2026-02-24T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.605368 4923 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.605412 4923 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.605473 4923 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.605487 4923 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.605517 4923 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: 
secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.605527 4923 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.605567 4923 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.605577 4923 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.605615 4923 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 24 
02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.605625 4923 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.605660 4923 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.605672 4923 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.605712 4923 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.605725 4923 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is 
forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681025 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e-hosts-file\") pod \"node-resolver-rf6ng\" (UID: \"4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e\") " pod="openshift-dns/node-resolver-rf6ng" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681079 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skztk\" (UniqueName: \"kubernetes.io/projected/4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e-kube-api-access-skztk\") pod \"node-resolver-rf6ng\" (UID: \"4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e\") " pod="openshift-dns/node-resolver-rf6ng" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681112 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-var-lib-cni-bin\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681135 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbaad03e-026e-4046-afab-4d26112ad358-cni-binary-copy\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681206 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-etc-openvswitch\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681344 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-system-cni-dir\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681413 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-systemd-units\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681449 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tkvl\" (UniqueName: \"kubernetes.io/projected/4607f544-e6b3-4188-9b33-c638dfb1bda4-kube-api-access-2tkvl\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681489 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-run-k8s-cni-cncf-io\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681560 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-system-cni-dir\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681593 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88dnt\" (UniqueName: \"kubernetes.io/projected/dbaad03e-026e-4046-afab-4d26112ad358-kube-api-access-88dnt\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681656 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovn-node-metrics-cert\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681677 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-cnibin\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681721 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f2467bf1-1ba4-491e-b677-79c589f353ec-rootfs\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681745 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-kubelet\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681790 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-systemd\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681811 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-var-lib-openvswitch\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681829 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2467bf1-1ba4-491e-b677-79c589f353ec-proxy-tls\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681879 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-ovn-kubernetes\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681935 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-script-lib\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681961 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-hostroot\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.681983 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckjfl\" (UniqueName: \"kubernetes.io/projected/062c912d-c8a5-4312-b691-dc6488667f7d-kube-api-access-ckjfl\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682014 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f2467bf1-1ba4-491e-b677-79c589f353ec-mcd-auth-proxy-config\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682088 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-netd\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682128 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/062c912d-c8a5-4312-b691-dc6488667f7d-cni-binary-copy\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682155 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/062c912d-c8a5-4312-b691-dc6488667f7d-multus-daemon-config\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682217 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-ovn\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682269 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-etc-kubernetes\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682344 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-run-netns\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682374 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-os-release\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682438 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dbaad03e-026e-4046-afab-4d26112ad358-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682464 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-slash\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682487 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682509 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-var-lib-kubelet\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682555 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-run-multus-certs\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682576 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682602 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-multus-cni-dir\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682653 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-var-lib-cni-multus\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682704 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-multus-conf-dir\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682732 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-cnibin\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682754 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-openvswitch\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682772 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-env-overrides\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682836 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snnwh\" (UniqueName: \"kubernetes.io/projected/f2467bf1-1ba4-491e-b677-79c589f353ec-kube-api-access-snnwh\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682860 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-netns\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682887 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-node-log\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682907 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-bin\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682928 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-config\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682947 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-os-release\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.682987 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-multus-socket-dir-parent\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.683079 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-log-socket\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.687770 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 20:13:18.095185822 +0000 UTC
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.705880 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.705922 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.705933 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.705957 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.705972 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:19Z","lastTransitionTime":"2026-02-24T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.714620 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.714764 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.715148 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.715212 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.746105 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kj4ts"]
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.746440 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kj4ts"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.748112 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.748828 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.749106 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.749218 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784012 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-systemd-units\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784230 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tkvl\" (UniqueName: \"kubernetes.io/projected/4607f544-e6b3-4188-9b33-c638dfb1bda4-kube-api-access-2tkvl\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784357 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-run-k8s-cni-cncf-io\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784177 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-systemd-units\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784402 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-run-k8s-cni-cncf-io\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784438 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-system-cni-dir\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784522 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88dnt\" (UniqueName: \"kubernetes.io/projected/dbaad03e-026e-4046-afab-4d26112ad358-kube-api-access-88dnt\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784543 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f2467bf1-1ba4-491e-b677-79c589f353ec-rootfs\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784583 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovn-node-metrics-cert\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784672 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f2467bf1-1ba4-491e-b677-79c589f353ec-rootfs\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784730 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-cnibin\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784801 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-system-cni-dir\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784840 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-cnibin\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.784986 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-kubelet\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.785054 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-kubelet\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.785018 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-systemd\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.785104 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-var-lib-openvswitch\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.785121 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckjfl\" (UniqueName: \"kubernetes.io/projected/062c912d-c8a5-4312-b691-dc6488667f7d-kube-api-access-ckjfl\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.785157 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-systemd\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.785182 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-var-lib-openvswitch\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.785206 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2467bf1-1ba4-491e-b677-79c589f353ec-proxy-tls\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.785237 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-ovn-kubernetes\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.785685 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-ovn-kubernetes\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786080 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-script-lib\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786121 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-hostroot\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786168 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f2467bf1-1ba4-491e-b677-79c589f353ec-mcd-auth-proxy-config\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786209 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-netd\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786234 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/062c912d-c8a5-4312-b691-dc6488667f7d-cni-binary-copy\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786256 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/062c912d-c8a5-4312-b691-dc6488667f7d-multus-daemon-config\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786280 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-ovn\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786333 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-etc-kubernetes\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786356 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-run-netns\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786371 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-os-release\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786387 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dbaad03e-026e-4046-afab-4d26112ad358-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786408 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-slash\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786426 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786444 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-var-lib-kubelet\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786459 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-run-multus-certs\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786475 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-cnibin\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786490 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786512 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-multus-cni-dir\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786527 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-var-lib-cni-multus\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786542 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-multus-conf-dir\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786561 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-openvswitch\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786578 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-env-overrides\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786594 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-multus-socket-dir-parent\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786610 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snnwh\" (UniqueName: \"kubernetes.io/projected/f2467bf1-1ba4-491e-b677-79c589f353ec-kube-api-access-snnwh\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786626 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-netns\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786640 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-node-log\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786653 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-bin\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786667 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-config\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786680 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-os-release\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786694 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-log-socket\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786714 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e-hosts-file\") pod \"node-resolver-rf6ng\" (UID: \"4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e\") " pod="openshift-dns/node-resolver-rf6ng"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786727 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skztk\" (UniqueName: \"kubernetes.io/projected/4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e-kube-api-access-skztk\") pod \"node-resolver-rf6ng\" (UID: \"4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e\") " pod="openshift-dns/node-resolver-rf6ng"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786759 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-var-lib-cni-bin\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786776 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbaad03e-026e-4046-afab-4d26112ad358-cni-binary-copy\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786795 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-etc-openvswitch\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786822 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-system-cni-dir\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9"
Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.786906 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-system-cni-dir\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc
kubenswrapper[4923]: I0224 02:56:19.786980 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-hostroot\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787245 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-var-lib-cni-multus\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787284 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-netd\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787304 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-bin\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787362 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e-hosts-file\") pod \"node-resolver-rf6ng\" (UID: \"4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e\") " pod="openshift-dns/node-resolver-rf6ng" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787387 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-multus-conf-dir\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787389 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-os-release\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787413 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-log-socket\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787433 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-openvswitch\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787447 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-var-lib-cni-bin\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787473 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-netns\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787489 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-node-log\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787533 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-etc-kubernetes\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787568 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f2467bf1-1ba4-491e-b677-79c589f353ec-mcd-auth-proxy-config\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787572 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-ovn\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787567 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-multus-socket-dir-parent\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787615 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-etc-openvswitch\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787609 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-run-netns\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787698 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-slash\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787704 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-run-multus-certs\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787732 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787732 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-os-release\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787767 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-cnibin\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787766 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-host-var-lib-kubelet\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.787856 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/062c912d-c8a5-4312-b691-dc6488667f7d-multus-cni-dir\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.788158 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbaad03e-026e-4046-afab-4d26112ad358-cni-binary-copy\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.788252 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/dbaad03e-026e-4046-afab-4d26112ad358-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.788260 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbaad03e-026e-4046-afab-4d26112ad358-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.788287 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/062c912d-c8a5-4312-b691-dc6488667f7d-multus-daemon-config\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.788619 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/062c912d-c8a5-4312-b691-dc6488667f7d-cni-binary-copy\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.790038 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2467bf1-1ba4-491e-b677-79c589f353ec-proxy-tls\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.803441 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88dnt\" (UniqueName: 
\"kubernetes.io/projected/dbaad03e-026e-4046-afab-4d26112ad358-kube-api-access-88dnt\") pod \"multus-additional-cni-plugins-qxffg\" (UID: \"dbaad03e-026e-4046-afab-4d26112ad358\") " pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.805219 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckjfl\" (UniqueName: \"kubernetes.io/projected/062c912d-c8a5-4312-b691-dc6488667f7d-kube-api-access-ckjfl\") pod \"multus-c74d9\" (UID: \"062c912d-c8a5-4312-b691-dc6488667f7d\") " pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.806452 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skztk\" (UniqueName: \"kubernetes.io/projected/4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e-kube-api-access-skztk\") pod \"node-resolver-rf6ng\" (UID: \"4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e\") " pod="openshift-dns/node-resolver-rf6ng" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.808116 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snnwh\" (UniqueName: \"kubernetes.io/projected/f2467bf1-1ba4-491e-b677-79c589f353ec-kube-api-access-snnwh\") pod \"machine-config-daemon-rh26t\" (UID: \"f2467bf1-1ba4-491e-b677-79c589f353ec\") " pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.808148 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.808188 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.808199 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.808214 4923 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.808224 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:19Z","lastTransitionTime":"2026-02-24T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.881741 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rf6ng" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.888404 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a69f86b2-c539-4c9b-8ebe-6136b97f22af-serviceca\") pod \"node-ca-kj4ts\" (UID: \"a69f86b2-c539-4c9b-8ebe-6136b97f22af\") " pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.888461 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a69f86b2-c539-4c9b-8ebe-6136b97f22af-host\") pod \"node-ca-kj4ts\" (UID: \"a69f86b2-c539-4c9b-8ebe-6136b97f22af\") " pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.888499 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l87mk\" (UniqueName: \"kubernetes.io/projected/a69f86b2-c539-4c9b-8ebe-6136b97f22af-kube-api-access-l87mk\") pod \"node-ca-kj4ts\" (UID: \"a69f86b2-c539-4c9b-8ebe-6136b97f22af\") " pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 
02:56:19.890887 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv"] Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.891407 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.893194 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b5ab7c9_bfe8_44e9_b4cd_2516d754fd3e.slice/crio-911bce5201e768c770af029c584e63b8e2f534390bee1aa6fd1b42d9f9b7c53e WatchSource:0}: Error finding container 911bce5201e768c770af029c584e63b8e2f534390bee1aa6fd1b42d9f9b7c53e: Status 404 returned error can't find the container with id 911bce5201e768c770af029c584e63b8e2f534390bee1aa6fd1b42d9f9b7c53e Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.893376 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.893663 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.904476 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-c74d9" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.911264 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.911334 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.911345 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.911360 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.911369 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:19Z","lastTransitionTime":"2026-02-24T02:56:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.915180 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.919315 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pl8mp"] Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.920062 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:19 crc kubenswrapper[4923]: E0224 02:56:19.920117 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pl8mp" podUID="fa5ec917-061e-4f9c-8930-994239908f27" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.929875 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qxffg" Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.930575 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod062c912d_c8a5_4312_b691_dc6488667f7d.slice/crio-f8e6b4612c23a7b2259f57ae3b8c227addbf6924097b943b7cb127a935a098f5 WatchSource:0}: Error finding container f8e6b4612c23a7b2259f57ae3b8c227addbf6924097b943b7cb127a935a098f5: Status 404 returned error can't find the container with id f8e6b4612c23a7b2259f57ae3b8c227addbf6924097b943b7cb127a935a098f5 Feb 24 02:56:19 crc kubenswrapper[4923]: W0224 02:56:19.950031 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbaad03e_026e_4046_afab_4d26112ad358.slice/crio-5ffcdc1e5f4fe07eb4e1f1eea0d7385caf8c4adce26a3a909286825f0420e5e9 WatchSource:0}: Error finding container 5ffcdc1e5f4fe07eb4e1f1eea0d7385caf8c4adce26a3a909286825f0420e5e9: Status 404 returned error can't find the container with id 5ffcdc1e5f4fe07eb4e1f1eea0d7385caf8c4adce26a3a909286825f0420e5e9 Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.989097 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l87mk\" (UniqueName: 
\"kubernetes.io/projected/a69f86b2-c539-4c9b-8ebe-6136b97f22af-kube-api-access-l87mk\") pod \"node-ca-kj4ts\" (UID: \"a69f86b2-c539-4c9b-8ebe-6136b97f22af\") " pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.989144 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf5kf\" (UniqueName: \"kubernetes.io/projected/fa5ec917-061e-4f9c-8930-994239908f27-kube-api-access-rf5kf\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.989184 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn5g4\" (UniqueName: \"kubernetes.io/projected/daddc4ee-a15c-4f20-9371-af144af7a8a7-kube-api-access-gn5g4\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.989218 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daddc4ee-a15c-4f20-9371-af144af7a8a7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.989274 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daddc4ee-a15c-4f20-9371-af144af7a8a7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:19 crc 
kubenswrapper[4923]: I0224 02:56:19.989314 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daddc4ee-a15c-4f20-9371-af144af7a8a7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.989405 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.989464 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a69f86b2-c539-4c9b-8ebe-6136b97f22af-serviceca\") pod \"node-ca-kj4ts\" (UID: \"a69f86b2-c539-4c9b-8ebe-6136b97f22af\") " pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.989500 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a69f86b2-c539-4c9b-8ebe-6136b97f22af-host\") pod \"node-ca-kj4ts\" (UID: \"a69f86b2-c539-4c9b-8ebe-6136b97f22af\") " pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.989577 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a69f86b2-c539-4c9b-8ebe-6136b97f22af-host\") pod \"node-ca-kj4ts\" (UID: \"a69f86b2-c539-4c9b-8ebe-6136b97f22af\") " pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:19 crc kubenswrapper[4923]: I0224 02:56:19.990364 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a69f86b2-c539-4c9b-8ebe-6136b97f22af-serviceca\") pod \"node-ca-kj4ts\" (UID: \"a69f86b2-c539-4c9b-8ebe-6136b97f22af\") " pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.012839 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.012972 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.013036 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.013055 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.013068 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.017248 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l87mk\" (UniqueName: \"kubernetes.io/projected/a69f86b2-c539-4c9b-8ebe-6136b97f22af-kube-api-access-l87mk\") pod \"node-ca-kj4ts\" (UID: \"a69f86b2-c539-4c9b-8ebe-6136b97f22af\") " pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.057612 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kj4ts" Feb 24 02:56:20 crc kubenswrapper[4923]: W0224 02:56:20.071976 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda69f86b2_c539_4c9b_8ebe_6136b97f22af.slice/crio-bd8a8a8fd7ac30a88011e527962c690614c7afee899fb2f0197b74bba2c01bcc WatchSource:0}: Error finding container bd8a8a8fd7ac30a88011e527962c690614c7afee899fb2f0197b74bba2c01bcc: Status 404 returned error can't find the container with id bd8a8a8fd7ac30a88011e527962c690614c7afee899fb2f0197b74bba2c01bcc Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.089890 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn5g4\" (UniqueName: \"kubernetes.io/projected/daddc4ee-a15c-4f20-9371-af144af7a8a7-kube-api-access-gn5g4\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.089927 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daddc4ee-a15c-4f20-9371-af144af7a8a7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.089992 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daddc4ee-a15c-4f20-9371-af144af7a8a7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.090009 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daddc4ee-a15c-4f20-9371-af144af7a8a7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.090134 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.090193 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf5kf\" (UniqueName: \"kubernetes.io/projected/fa5ec917-061e-4f9c-8930-994239908f27-kube-api-access-rf5kf\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.090274 4923 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.090369 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs podName:fa5ec917-061e-4f9c-8930-994239908f27 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:20.590349131 +0000 UTC m=+104.607419944 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs") pod "network-metrics-daemon-pl8mp" (UID: "fa5ec917-061e-4f9c-8930-994239908f27") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.093335 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/daddc4ee-a15c-4f20-9371-af144af7a8a7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.106705 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf5kf\" (UniqueName: \"kubernetes.io/projected/fa5ec917-061e-4f9c-8930-994239908f27-kube-api-access-rf5kf\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.115105 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.115141 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.115152 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.115169 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.115179 4923 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.146062 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxffg" event={"ID":"dbaad03e-026e-4046-afab-4d26112ad358","Type":"ContainerStarted","Data":"5ffcdc1e5f4fe07eb4e1f1eea0d7385caf8c4adce26a3a909286825f0420e5e9"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.148128 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c74d9" event={"ID":"062c912d-c8a5-4312-b691-dc6488667f7d","Type":"ContainerStarted","Data":"6195211d6b7a3b2c1cb8715f8b02764e7dff4105d615755005c85b97f1785a08"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.148157 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c74d9" event={"ID":"062c912d-c8a5-4312-b691-dc6488667f7d","Type":"ContainerStarted","Data":"f8e6b4612c23a7b2259f57ae3b8c227addbf6924097b943b7cb127a935a098f5"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.149237 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kj4ts" event={"ID":"a69f86b2-c539-4c9b-8ebe-6136b97f22af","Type":"ContainerStarted","Data":"bd8a8a8fd7ac30a88011e527962c690614c7afee899fb2f0197b74bba2c01bcc"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.150930 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rf6ng" event={"ID":"4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e","Type":"ContainerStarted","Data":"0fae1ca0a37f9d532cd6cce6104ddf0d0127b170fb6a5042c75818bf54db3002"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 
02:56:20.150968 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rf6ng" event={"ID":"4b5ab7c9-bfe8-44e9-b4cd-2516d754fd3e","Type":"ContainerStarted","Data":"911bce5201e768c770af029c584e63b8e2f534390bee1aa6fd1b42d9f9b7c53e"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.153235 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"92a19cc205b64e61b6c65d1f93e5df48760062306a031253913e5f685cebe0c6"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.153276 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"2fba1c60062fe1dacc6acce64fb97479eee447850d91da02b181099208b92b93"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.174781 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-c74d9" podStartSLOduration=28.174758905 podStartE2EDuration="28.174758905s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:20.162984135 +0000 UTC m=+104.180054948" watchObservedRunningTime="2026-02-24 02:56:20.174758905 +0000 UTC m=+104.191829718" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.217688 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.217728 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.217736 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.217751 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.217759 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.320223 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.320263 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.320273 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.320290 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.320317 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.422498 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.422523 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.422531 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.422543 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.422551 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.476062 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.524508 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.524580 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.524590 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.524604 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.524654 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.596565 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.596678 4923 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.596741 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs podName:fa5ec917-061e-4f9c-8930-994239908f27 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:21.596720453 +0000 UTC m=+105.613791266 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs") pod "network-metrics-daemon-pl8mp" (UID: "fa5ec917-061e-4f9c-8930-994239908f27") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.627403 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.627449 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.627466 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.627489 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.627508 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.650769 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.675402 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.679001 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-config\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.681193 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/daddc4ee-a15c-4f20-9371-af144af7a8a7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.688778 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 04:57:25.57027065 +0000 UTC Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.712569 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.712741 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.729496 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.729530 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.729540 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.729555 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.729564 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.785316 4923 secret.go:188] Couldn't get secret openshift-ovn-kubernetes/ovn-node-metrics-cert: failed to sync secret cache: timed out waiting for the condition Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.785463 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovn-node-metrics-cert podName:4607f544-e6b3-4188-9b33-c638dfb1bda4 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:21.285430815 +0000 UTC m=+105.302501668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovn-node-metrics-cert" (UniqueName: "kubernetes.io/secret/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovn-node-metrics-cert") pod "ovnkube-node-42sfg" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4") : failed to sync secret cache: timed out waiting for the condition Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.787374 4923 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-script-lib: failed to sync configmap cache: timed out waiting for the condition Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.787464 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-script-lib podName:4607f544-e6b3-4188-9b33-c638dfb1bda4 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:21.287445158 +0000 UTC m=+105.304515971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovnkube-script-lib" (UniqueName: "kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-script-lib") pod "ovnkube-node-42sfg" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4") : failed to sync configmap cache: timed out waiting for the condition Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.787528 4923 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/env-overrides: failed to sync configmap cache: timed out waiting for the condition Feb 24 02:56:20 crc kubenswrapper[4923]: E0224 02:56:20.787600 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-env-overrides podName:4607f544-e6b3-4188-9b33-c638dfb1bda4 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:21.287582182 +0000 UTC m=+105.304653035 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "env-overrides" (UniqueName: "kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-env-overrides") pod "ovnkube-node-42sfg" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4") : failed to sync configmap cache: timed out waiting for the condition Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.833092 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.833129 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.833138 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.833150 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.833158 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.865245 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.871182 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/daddc4ee-a15c-4f20-9371-af144af7a8a7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.935216 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.935258 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.935276 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.935319 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.935335 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:20Z","lastTransitionTime":"2026-02-24T02:56:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:20 crc kubenswrapper[4923]: I0224 02:56:20.989109 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.037679 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.037745 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.037756 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.037769 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.037780 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.061149 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.065261 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tkvl\" (UniqueName: \"kubernetes.io/projected/4607f544-e6b3-4188-9b33-c638dfb1bda4-kube-api-access-2tkvl\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.067193 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn5g4\" (UniqueName: \"kubernetes.io/projected/daddc4ee-a15c-4f20-9371-af144af7a8a7-kube-api-access-gn5g4\") pod \"ovnkube-control-plane-749d76644c-rkqsv\" (UID: \"daddc4ee-a15c-4f20-9371-af144af7a8a7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.091548 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.125489 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.140208 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.140256 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.140268 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.140283 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.140308 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:21 crc kubenswrapper[4923]: W0224 02:56:21.143420 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaddc4ee_a15c_4f20_9371_af144af7a8a7.slice/crio-5d832deb549cea1f0c43582480378bd2a066731880121993eb12b278a7692842 WatchSource:0}: Error finding container 5d832deb549cea1f0c43582480378bd2a066731880121993eb12b278a7692842: Status 404 returned error can't find the container with id 5d832deb549cea1f0c43582480378bd2a066731880121993eb12b278a7692842 Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.162947 4923 generic.go:334] "Generic (PLEG): container finished" podID="dbaad03e-026e-4046-afab-4d26112ad358" containerID="6313089e5ee7d5bcba3fb08665979ec657e1d45f407a13968c1d3312898edd1c" exitCode=0 Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.163420 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxffg" event={"ID":"dbaad03e-026e-4046-afab-4d26112ad358","Type":"ContainerDied","Data":"6313089e5ee7d5bcba3fb08665979ec657e1d45f407a13968c1d3312898edd1c"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.170313 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kj4ts" event={"ID":"a69f86b2-c539-4c9b-8ebe-6136b97f22af","Type":"ContainerStarted","Data":"f2acfa8b1aa02d0760ffc3239df580cb2652ff6cfb16817fcdb3082b7aa15571"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.172629 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" event={"ID":"daddc4ee-a15c-4f20-9371-af144af7a8a7","Type":"ContainerStarted","Data":"5d832deb549cea1f0c43582480378bd2a066731880121993eb12b278a7692842"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.182699 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" 
event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"bcc6d64628024b3cd8aee1c89f7c6452e6b36989a6f64498c4cedec5a2b87cdc"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.183691 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rf6ng" podStartSLOduration=29.183680859 podStartE2EDuration="29.183680859s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:20.175366871 +0000 UTC m=+104.192437684" watchObservedRunningTime="2026-02-24 02:56:21.183680859 +0000 UTC m=+105.200751672" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.214320 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podStartSLOduration=29.214284666 podStartE2EDuration="29.214284666s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:21.213081264 +0000 UTC m=+105.230152097" watchObservedRunningTime="2026-02-24 02:56:21.214284666 +0000 UTC m=+105.231355479" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.214425 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kj4ts" podStartSLOduration=29.214420019 podStartE2EDuration="29.214420019s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:21.198787607 +0000 UTC m=+105.215858440" watchObservedRunningTime="2026-02-24 02:56:21.214420019 +0000 UTC m=+105.231490832" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.244085 4923 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.244146 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.244154 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.244168 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.244179 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.309277 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovn-node-metrics-cert\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.309396 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-script-lib\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.309464 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-env-overrides\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.310308 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-script-lib\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.310377 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-env-overrides\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:21 crc 
kubenswrapper[4923]: I0224 02:56:21.314162 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovn-node-metrics-cert\") pod \"ovnkube-node-42sfg\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") " pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.346649 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.346682 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.346691 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.346705 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.346715 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.436661 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.449189 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.449226 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.449237 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.449252 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.449263 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:21 crc kubenswrapper[4923]: W0224 02:56:21.465101 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4607f544_e6b3_4188_9b33_c638dfb1bda4.slice/crio-65aca8e9e8534a2b1ab71856f9bba823f8c26742e718111f12b0252c55de15b6 WatchSource:0}: Error finding container 65aca8e9e8534a2b1ab71856f9bba823f8c26742e718111f12b0252c55de15b6: Status 404 returned error can't find the container with id 65aca8e9e8534a2b1ab71856f9bba823f8c26742e718111f12b0252c55de15b6 Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.551546 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.551587 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.551599 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.551615 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.551625 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.612460 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:21 crc kubenswrapper[4923]: E0224 02:56:21.612605 4923 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:21 crc kubenswrapper[4923]: E0224 02:56:21.612655 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs podName:fa5ec917-061e-4f9c-8930-994239908f27 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:23.612642102 +0000 UTC m=+107.629712915 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs") pod "network-metrics-daemon-pl8mp" (UID: "fa5ec917-061e-4f9c-8930-994239908f27") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.654462 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.654517 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.654531 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.654550 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.654565 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.689986 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 23:15:37.907829027 +0000 UTC Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.712516 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:21 crc kubenswrapper[4923]: E0224 02:56:21.715774 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.714290 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:21 crc kubenswrapper[4923]: E0224 02:56:21.716001 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pl8mp" podUID="fa5ec917-061e-4f9c-8930-994239908f27" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.716321 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:21 crc kubenswrapper[4923]: E0224 02:56:21.716423 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.757729 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.757769 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.757780 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.757797 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.757808 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.860421 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.860713 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.860833 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.860915 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.860997 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.963966 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.964126 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.964230 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.964360 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:21 crc kubenswrapper[4923]: I0224 02:56:21.964430 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:21Z","lastTransitionTime":"2026-02-24T02:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.067580 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.067947 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.067955 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.067970 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.067980 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.170571 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.170613 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.170621 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.170635 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.170646 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.187521 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" event={"ID":"daddc4ee-a15c-4f20-9371-af144af7a8a7","Type":"ContainerStarted","Data":"12acc86672e6dcd3a277a514675463dc26d6dfac330f7fc2143617694e8c3722"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.187573 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" event={"ID":"daddc4ee-a15c-4f20-9371-af144af7a8a7","Type":"ContainerStarted","Data":"b729a0b93667e9652eb04ec40effac7fc10caff445343925db5c0479c5507839"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.189263 4923 generic.go:334] "Generic (PLEG): container finished" podID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerID="4e6c422888a838b09f87f0610515692a5e75561f0c1b398cd083abdde442ed77" exitCode=0 Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.189337 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"4e6c422888a838b09f87f0610515692a5e75561f0c1b398cd083abdde442ed77"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.189389 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerStarted","Data":"65aca8e9e8534a2b1ab71856f9bba823f8c26742e718111f12b0252c55de15b6"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.191808 4923 generic.go:334] "Generic (PLEG): container finished" podID="dbaad03e-026e-4046-afab-4d26112ad358" containerID="6bcacac5102fd60d177ed79a5a7c2535660f61e7736e837bc5e8c123980fee6e" exitCode=0 Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.191846 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-qxffg" event={"ID":"dbaad03e-026e-4046-afab-4d26112ad358","Type":"ContainerDied","Data":"6bcacac5102fd60d177ed79a5a7c2535660f61e7736e837bc5e8c123980fee6e"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.203651 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rkqsv" podStartSLOduration=30.203632615 podStartE2EDuration="30.203632615s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:22.202745482 +0000 UTC m=+106.219816335" watchObservedRunningTime="2026-02-24 02:56:22.203632615 +0000 UTC m=+106.220703448" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.273025 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.273066 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.273076 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.273090 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.273100 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.375257 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.375308 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.375317 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.375335 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.375345 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.477951 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.477994 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.478005 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.478024 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.478037 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.580048 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.580080 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.580089 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.580108 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.580117 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.682210 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.682250 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.682263 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.682278 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.682288 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.690798 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 23:31:49.449869732 +0000 UTC Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.712405 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:22 crc kubenswrapper[4923]: E0224 02:56:22.712505 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.784137 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.784445 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.784459 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.784476 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.784488 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.888500 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.888564 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.888583 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.888607 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.888633 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.991770 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.991799 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.991807 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.991821 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:22 crc kubenswrapper[4923]: I0224 02:56:22.991831 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:22Z","lastTransitionTime":"2026-02-24T02:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.093703 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.093748 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.093761 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.093778 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.093793 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:23Z","lastTransitionTime":"2026-02-24T02:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.195483 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.195519 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.195532 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.195547 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.195556 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:23Z","lastTransitionTime":"2026-02-24T02:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.197170 4923 generic.go:334] "Generic (PLEG): container finished" podID="dbaad03e-026e-4046-afab-4d26112ad358" containerID="cdc1bff2e7dbd6aa3cb0dcac3090542ed8754e5caff360d49f0a0cdacf587b09" exitCode=0 Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.197239 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxffg" event={"ID":"dbaad03e-026e-4046-afab-4d26112ad358","Type":"ContainerDied","Data":"cdc1bff2e7dbd6aa3cb0dcac3090542ed8754e5caff360d49f0a0cdacf587b09"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.203556 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerStarted","Data":"0762e472a6d74f5219ca62ad13723c38f97f1ba9f52157ecf228d84623eaf3d5"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.203615 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerStarted","Data":"55d1c59325bdc8422882f9054db3a2a544f0676067e2d657be7563a9d41a8e5d"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.203637 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerStarted","Data":"e8fcb85b60d31246ae99915253a8e7f54f97f112576ce0f72298af111bfd8913"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.203656 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerStarted","Data":"bfb689c9f3fdd896e5b3f0cae8dfad3202a2fdace3d62893ab7998a2e8a89c9e"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.203674 4923 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerStarted","Data":"96ba114882b2a7940916954fd6b959b6aea49bb56099cb8754a4acda08ee1f4a"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.203691 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerStarted","Data":"83f7a60316410df3d5d5e554238f282a294d98d0a3801e5d4a1fd983d3a778a1"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.298757 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.298801 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.298813 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.298835 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.298849 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:23Z","lastTransitionTime":"2026-02-24T02:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.335391 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.335576 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:31.335545501 +0000 UTC m=+115.352616314 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.400549 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.400585 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.400596 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.400612 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:23 crc kubenswrapper[4923]: 
I0224 02:56:23.400624 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:23Z","lastTransitionTime":"2026-02-24T02:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.436284 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.436362 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.436389 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.436412 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436502 4923 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436511 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436533 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436544 4923 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436554 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436588 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436600 4923 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436502 4923 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436565 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:31.436547962 +0000 UTC m=+115.453618775 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436682 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:31.436665825 +0000 UTC m=+115.453736638 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436694 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:31.436688546 +0000 UTC m=+115.453759359 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.436707 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:31.436700336 +0000 UTC m=+115.453771149 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.502638 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.502673 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.502682 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.502697 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.502706 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:23Z","lastTransitionTime":"2026-02-24T02:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.605137 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.605195 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.605212 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.605239 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.605257 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:23Z","lastTransitionTime":"2026-02-24T02:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.638698 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.638838 4923 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.638889 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs podName:fa5ec917-061e-4f9c-8930-994239908f27 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:27.638871783 +0000 UTC m=+111.655942596 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs") pod "network-metrics-daemon-pl8mp" (UID: "fa5ec917-061e-4f9c-8930-994239908f27") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.691412 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 12:08:09.959254446 +0000 UTC Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.708506 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.708539 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.708551 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.708569 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.708580 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:23Z","lastTransitionTime":"2026-02-24T02:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.713497 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.713689 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pl8mp" podUID="fa5ec917-061e-4f9c-8930-994239908f27" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.715025 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.715036 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.715130 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.715209 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.726068 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.726613 4923 scope.go:117] "RemoveContainer" containerID="ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c" Feb 24 02:56:23 crc kubenswrapper[4923]: E0224 02:56:23.726794 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.811384 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.811413 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.811423 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.811436 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.811444 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:23Z","lastTransitionTime":"2026-02-24T02:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.914863 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.914912 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.914931 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.914956 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:23 crc kubenswrapper[4923]: I0224 02:56:23.914973 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:23Z","lastTransitionTime":"2026-02-24T02:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.016886 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.016935 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.016948 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.016965 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.016976 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.119843 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.119878 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.119887 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.119899 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.119909 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.209126 4923 generic.go:334] "Generic (PLEG): container finished" podID="dbaad03e-026e-4046-afab-4d26112ad358" containerID="12b8c69e1d2423fdda429366878ecb866f6bd0e4c5e029a33d3cf958ee3d9eda" exitCode=0 Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.209408 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxffg" event={"ID":"dbaad03e-026e-4046-afab-4d26112ad358","Type":"ContainerDied","Data":"12b8c69e1d2423fdda429366878ecb866f6bd0e4c5e029a33d3cf958ee3d9eda"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.209853 4923 scope.go:117] "RemoveContainer" containerID="ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c" Feb 24 02:56:24 crc kubenswrapper[4923]: E0224 02:56:24.210079 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.221540 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.221583 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.221595 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.221612 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 
02:56:24.221623 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.324737 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.324789 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.324801 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.324820 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.324832 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.426888 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.426923 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.426934 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.426951 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.426962 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.530559 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.530606 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.530625 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.530647 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.530662 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.633494 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.633543 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.633558 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.633577 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.633588 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.691969 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 03:52:11.513087343 +0000 UTC Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.712447 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:24 crc kubenswrapper[4923]: E0224 02:56:24.712606 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.736022 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.736086 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.736110 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.736138 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.736163 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.839117 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.839172 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.839189 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.839213 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.839230 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.941343 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.941388 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.941404 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.941427 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:24 crc kubenswrapper[4923]: I0224 02:56:24.941442 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:24Z","lastTransitionTime":"2026-02-24T02:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.044773 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.044838 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.044856 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.044881 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.044900 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.146593 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.146647 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.146664 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.146686 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.146702 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.215895 4923 generic.go:334] "Generic (PLEG): container finished" podID="dbaad03e-026e-4046-afab-4d26112ad358" containerID="1e3c6fff29e005a7ffedcd560f9169ad00cb7a884adf2538119fb06263db9f92" exitCode=0 Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.215969 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxffg" event={"ID":"dbaad03e-026e-4046-afab-4d26112ad358","Type":"ContainerDied","Data":"1e3c6fff29e005a7ffedcd560f9169ad00cb7a884adf2538119fb06263db9f92"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.222242 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerStarted","Data":"aa26ad1aff7dedd659bb4fc9764cebae5f1aadd4c2651488e35d7342480a46d8"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.249012 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.249046 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.249055 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.249071 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.249080 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.351684 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.351719 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.351726 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.351741 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.351750 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.454471 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.454814 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.454823 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.454839 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.454849 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.557346 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.557385 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.557394 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.557408 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.557417 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.661172 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.661220 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.661234 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.661253 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.661267 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.692656 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 20:15:00.110092879 +0000 UTC Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.712780 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.712873 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.712796 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:25 crc kubenswrapper[4923]: E0224 02:56:25.713065 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pl8mp" podUID="fa5ec917-061e-4f9c-8930-994239908f27" Feb 24 02:56:25 crc kubenswrapper[4923]: E0224 02:56:25.713207 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 02:56:25 crc kubenswrapper[4923]: E0224 02:56:25.713491 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.763721 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.763769 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.763787 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.763806 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.763819 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.867116 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.867155 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.867163 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.867179 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.867188 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.969392 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.969428 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.969437 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.969451 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:25 crc kubenswrapper[4923]: I0224 02:56:25.969462 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:25Z","lastTransitionTime":"2026-02-24T02:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.072273 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.072395 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.072412 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.072436 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.072460 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:26Z","lastTransitionTime":"2026-02-24T02:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.175413 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.175473 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.175488 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.175510 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.175526 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:26Z","lastTransitionTime":"2026-02-24T02:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.230282 4923 generic.go:334] "Generic (PLEG): container finished" podID="dbaad03e-026e-4046-afab-4d26112ad358" containerID="53ae7d1225b2a09b32904d5102610950875815aa28a2171ffc70a1a7a63e1b19" exitCode=0 Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.230356 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxffg" event={"ID":"dbaad03e-026e-4046-afab-4d26112ad358","Type":"ContainerDied","Data":"53ae7d1225b2a09b32904d5102610950875815aa28a2171ffc70a1a7a63e1b19"} Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.278472 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.278509 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.278519 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.278533 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.278543 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:26Z","lastTransitionTime":"2026-02-24T02:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.387770 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.387833 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.387852 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.387874 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.387890 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:26Z","lastTransitionTime":"2026-02-24T02:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.490355 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.490702 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.490718 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.490736 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.490750 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:26Z","lastTransitionTime":"2026-02-24T02:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.592909 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.592942 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.592952 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.592967 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.592977 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:26Z","lastTransitionTime":"2026-02-24T02:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.648818 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.648883 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.648895 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.648916 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.648931 4923 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T02:56:26Z","lastTransitionTime":"2026-02-24T02:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.692877 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 08:24:38.457553998 +0000 UTC Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.692963 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.701131 4923 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.702748 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2"] Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.703319 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.704944 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.705479 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.705523 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.708885 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.712833 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:26 crc kubenswrapper[4923]: E0224 02:56:26.713014 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.728651 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.873760 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4855b06-4fa7-4187-9617-7d6d657b7fc0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.873806 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e4855b06-4fa7-4187-9617-7d6d657b7fc0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.873828 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4855b06-4fa7-4187-9617-7d6d657b7fc0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: 
\"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.873845 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4855b06-4fa7-4187-9617-7d6d657b7fc0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.873913 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e4855b06-4fa7-4187-9617-7d6d657b7fc0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.975596 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e4855b06-4fa7-4187-9617-7d6d657b7fc0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.975675 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4855b06-4fa7-4187-9617-7d6d657b7fc0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.975735 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4855b06-4fa7-4187-9617-7d6d657b7fc0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.975784 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e4855b06-4fa7-4187-9617-7d6d657b7fc0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.975810 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e4855b06-4fa7-4187-9617-7d6d657b7fc0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.975896 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e4855b06-4fa7-4187-9617-7d6d657b7fc0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.975945 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4855b06-4fa7-4187-9617-7d6d657b7fc0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.976967 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4855b06-4fa7-4187-9617-7d6d657b7fc0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.984164 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4855b06-4fa7-4187-9617-7d6d657b7fc0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:26 crc kubenswrapper[4923]: I0224 02:56:26.997515 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e4855b06-4fa7-4187-9617-7d6d657b7fc0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hzsg2\" (UID: \"e4855b06-4fa7-4187-9617-7d6d657b7fc0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:27 crc kubenswrapper[4923]: I0224 02:56:27.021721 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" Feb 24 02:56:27 crc kubenswrapper[4923]: W0224 02:56:27.034421 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4855b06_4fa7_4187_9617_7d6d657b7fc0.slice/crio-445ac66b0084f9222ebc5910e7f1e57ffe8a8dbeed87c49bf43b7877e2122824 WatchSource:0}: Error finding container 445ac66b0084f9222ebc5910e7f1e57ffe8a8dbeed87c49bf43b7877e2122824: Status 404 returned error can't find the container with id 445ac66b0084f9222ebc5910e7f1e57ffe8a8dbeed87c49bf43b7877e2122824 Feb 24 02:56:27 crc kubenswrapper[4923]: I0224 02:56:27.235314 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" event={"ID":"e4855b06-4fa7-4187-9617-7d6d657b7fc0","Type":"ContainerStarted","Data":"445ac66b0084f9222ebc5910e7f1e57ffe8a8dbeed87c49bf43b7877e2122824"} Feb 24 02:56:27 crc kubenswrapper[4923]: I0224 02:56:27.240793 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qxffg" event={"ID":"dbaad03e-026e-4046-afab-4d26112ad358","Type":"ContainerStarted","Data":"12894af1b033e6187e134caab985da08b9f3529d532a4630f846963d091b9d00"} Feb 24 02:56:27 crc kubenswrapper[4923]: I0224 02:56:27.263964 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.263945673 podStartE2EDuration="1.263945673s" podCreationTimestamp="2026-02-24 02:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:27.263098751 +0000 UTC m=+111.280169604" watchObservedRunningTime="2026-02-24 02:56:27.263945673 +0000 UTC m=+111.281016506" Feb 24 02:56:27 crc kubenswrapper[4923]: I0224 02:56:27.287007 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-qxffg" podStartSLOduration=35.28698714 podStartE2EDuration="35.28698714s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:27.286065566 +0000 UTC m=+111.303136399" watchObservedRunningTime="2026-02-24 02:56:27.28698714 +0000 UTC m=+111.304057963" Feb 24 02:56:27 crc kubenswrapper[4923]: I0224 02:56:27.683566 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:27 crc kubenswrapper[4923]: E0224 02:56:27.683788 4923 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:27 crc kubenswrapper[4923]: E0224 02:56:27.683921 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs podName:fa5ec917-061e-4f9c-8930-994239908f27 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.683888067 +0000 UTC m=+119.700958910 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs") pod "network-metrics-daemon-pl8mp" (UID: "fa5ec917-061e-4f9c-8930-994239908f27") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 02:56:27 crc kubenswrapper[4923]: I0224 02:56:27.712177 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:27 crc kubenswrapper[4923]: I0224 02:56:27.712176 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:27 crc kubenswrapper[4923]: I0224 02:56:27.712284 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:27 crc kubenswrapper[4923]: E0224 02:56:27.713622 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 02:56:27 crc kubenswrapper[4923]: E0224 02:56:27.713707 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pl8mp" podUID="fa5ec917-061e-4f9c-8930-994239908f27" Feb 24 02:56:27 crc kubenswrapper[4923]: E0224 02:56:27.713813 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 02:56:28 crc kubenswrapper[4923]: I0224 02:56:28.244642 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" event={"ID":"e4855b06-4fa7-4187-9617-7d6d657b7fc0","Type":"ContainerStarted","Data":"950d59d9134d3be62b0a15dc9b53c25d635cab7c3285c6064bf224eed730a33a"} Feb 24 02:56:28 crc kubenswrapper[4923]: I0224 02:56:28.249854 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerStarted","Data":"7504f1c88847e620e449f4f4a803c9f641abb29c95bfcb5ff4e88a7c5135f113"} Feb 24 02:56:28 crc kubenswrapper[4923]: I0224 02:56:28.263911 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hzsg2" podStartSLOduration=36.263896481 podStartE2EDuration="36.263896481s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:28.262963886 +0000 UTC m=+112.280034719" watchObservedRunningTime="2026-02-24 02:56:28.263896481 +0000 UTC m=+112.280967294" Feb 24 02:56:28 crc kubenswrapper[4923]: I0224 02:56:28.291684 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" podStartSLOduration=36.291666282 podStartE2EDuration="36.291666282s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:28.291525009 +0000 UTC m=+112.308595842" watchObservedRunningTime="2026-02-24 02:56:28.291666282 +0000 UTC m=+112.308737105" Feb 24 02:56:28 crc kubenswrapper[4923]: I0224 
02:56:28.712401 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:28 crc kubenswrapper[4923]: E0224 02:56:28.712602 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 02:56:29 crc kubenswrapper[4923]: I0224 02:56:29.255772 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:29 crc kubenswrapper[4923]: I0224 02:56:29.256096 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:29 crc kubenswrapper[4923]: I0224 02:56:29.256107 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:29 crc kubenswrapper[4923]: I0224 02:56:29.278000 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:29 crc kubenswrapper[4923]: I0224 02:56:29.281283 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 02:56:29 crc kubenswrapper[4923]: I0224 02:56:29.712632 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:29 crc kubenswrapper[4923]: I0224 02:56:29.712632 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:29 crc kubenswrapper[4923]: I0224 02:56:29.712752 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:29 crc kubenswrapper[4923]: E0224 02:56:29.712839 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 02:56:29 crc kubenswrapper[4923]: E0224 02:56:29.712952 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 02:56:29 crc kubenswrapper[4923]: E0224 02:56:29.713101 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pl8mp" podUID="fa5ec917-061e-4f9c-8930-994239908f27" Feb 24 02:56:29 crc kubenswrapper[4923]: I0224 02:56:29.717002 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pl8mp"] Feb 24 02:56:30 crc kubenswrapper[4923]: I0224 02:56:30.258372 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:30 crc kubenswrapper[4923]: E0224 02:56:30.258488 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pl8mp" podUID="fa5ec917-061e-4f9c-8930-994239908f27" Feb 24 02:56:30 crc kubenswrapper[4923]: I0224 02:56:30.712944 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:30 crc kubenswrapper[4923]: E0224 02:56:30.713155 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.421243 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.421337 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 02:56:47.421319847 +0000 UTC m=+131.438390660 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.522623 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.522818 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.522962 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.522975 4923 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.523022 4923 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:47.523007267 +0000 UTC m=+131.540078080 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.523020 4923 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.522942 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.523070 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:47.523057728 +0000 UTC m=+131.540128541 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.523085 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.523113 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.523159 4923 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.523178 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:47.523172581 +0000 UTC m=+131.540243384 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.523231 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.523243 4923 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.523253 4923 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.523273 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:47.523267194 +0000 UTC m=+131.540338007 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.712515 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.712515 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.712777 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.712832 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.712634 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.712966 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pl8mp" podUID="fa5ec917-061e-4f9c-8930-994239908f27" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.927343 4923 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.927517 4923 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.985988 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q2s27"] Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.986766 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pzbwf"] Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.987349 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.987522 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k2q5j"] Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.987757 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.988038 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.991431 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gsv9x"] Feb 24 02:56:31 crc kubenswrapper[4923]: W0224 02:56:31.991851 4923 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.991889 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7"] Feb 24 02:56:31 crc kubenswrapper[4923]: E0224 02:56:31.991924 4923 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.992126 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ll9tx"] Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.992706 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gsv9x" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.992770 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.993345 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.994885 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz"] Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.995752 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-w5k6j"] Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.996249 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx"] Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.996675 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" Feb 24 02:56:31 crc kubenswrapper[4923]: I0224 02:56:31.996765 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.029469 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-82dxq"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.029894 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.034028 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 02:56:32 crc kubenswrapper[4923]: W0224 02:56:32.034148 4923 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 24 02:56:32 crc kubenswrapper[4923]: E0224 02:56:32.034175 4923 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.034231 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: W0224 02:56:32.034367 4923 reflector.go:561] object-"openshift-dns-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns-operator": no relationship found between node 'crc' and this object Feb 24 02:56:32 crc kubenswrapper[4923]: E0224 02:56:32.034382 4923 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-dns-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.034983 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.036544 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.034884 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ffdsb"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.039744 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.039867 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.039947 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.040025 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.040101 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.040244 4923 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.041140 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfds7"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.041640 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jdnlf"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.042003 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.042326 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.042490 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.042412 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-clfml"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.042862 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.042908 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.043064 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.043176 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.043404 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.043533 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.043646 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.043752 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.043872 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.044018 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.042831 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.044122 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.044135 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.044216 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.044407 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.044529 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.044636 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.046166 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.046518 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.046659 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.046850 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd"] Feb 24 02:56:32 crc 
kubenswrapper[4923]: I0224 02:56:32.046976 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.047191 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.047224 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.047703 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.047873 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048029 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048104 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048194 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048233 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048422 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048439 4923 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048484 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048444 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048584 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048664 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048707 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.048820 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.049110 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.049746 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.049870 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.050033 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.053070 4923 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.053733 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.054183 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.055617 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.058455 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.058434 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.058791 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.058967 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.058850 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.059219 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.059386 4923 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.059530 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.059649 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.059777 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.059891 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.059827 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.060006 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.059799 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.059849 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.064640 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.065454 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 
02:56:32.065667 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.070368 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.071437 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.071447 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q2s27"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.071995 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.072224 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.072415 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.072530 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.073026 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.073234 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.073845 4923 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.076355 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.076992 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.077564 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.077783 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.077882 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.090659 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.091714 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.092398 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.094011 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.098435 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.101382 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.111872 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.111998 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.112256 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.112620 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.113085 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.113191 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.113269 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.113382 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 24 02:56:32 crc 
kubenswrapper[4923]: I0224 02:56:32.113608 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.115019 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-95gv5"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.115600 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ctnr7"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.115916 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.116238 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.116791 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.116953 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.118730 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.119155 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.120241 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.120711 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.121004 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.123900 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.125444 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.126483 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.127133 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.128145 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.128749 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.129080 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.129319 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.129506 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.131682 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.131955 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.134728 4923 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.135799 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.136773 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.136933 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.137137 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.137780 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.138001 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.139066 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.139534 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.140866 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.141828 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142832 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-console-config\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142856 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-machine-approver-tls\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142877 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v55lc\" (UniqueName: \"kubernetes.io/projected/5401c124-d8da-4335-8cd9-b8afc71fc682-kube-api-access-v55lc\") pod \"cluster-samples-operator-665b6dd947-gwpbz\" (UID: \"5401c124-d8da-4335-8cd9-b8afc71fc682\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142895 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58b2a3df-daae-45b6-8343-dedec3d3ecce-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142910 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/348bc8fb-6fa1-40dc-82ca-99683b7e68ed-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8rxcd\" (UID: \"348bc8fb-6fa1-40dc-82ca-99683b7e68ed\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142924 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-trusted-ca-bundle\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142938 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/58b2a3df-daae-45b6-8343-dedec3d3ecce-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142957 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88764e4a-5a09-4223-b9b5-d576d2b36f41-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cldx7\" (UID: \"88764e4a-5a09-4223-b9b5-d576d2b36f41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142972 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.142988 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rqd9\" (UniqueName: \"kubernetes.io/projected/c5c02b8b-cae8-4e73-9e6f-34a8120f00c2-kube-api-access-8rqd9\") pod \"downloads-7954f5f757-gsv9x\" (UID: \"c5c02b8b-cae8-4e73-9e6f-34a8120f00c2\") " pod="openshift-console/downloads-7954f5f757-gsv9x" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143005 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348bc8fb-6fa1-40dc-82ca-99683b7e68ed-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8rxcd\" (UID: \"348bc8fb-6fa1-40dc-82ca-99683b7e68ed\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143021 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-serving-cert\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143036 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzrg\" (UniqueName: \"kubernetes.io/projected/9605a72a-cc35-4904-a7f3-bbeff4972542-kube-api-access-gvzrg\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143054 
4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143070 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e59578f2-07d5-4eb9-8b58-22a2b4f73a3b-metrics-tls\") pod \"dns-operator-744455d44c-q2s27\" (UID: \"e59578f2-07d5-4eb9-8b58-22a2b4f73a3b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143086 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143104 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143121 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143137 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1e09aba-671f-45f8-84e1-8a3813b39383-etcd-client\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143151 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143168 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84drt\" (UniqueName: \"kubernetes.io/projected/58b2a3df-daae-45b6-8343-dedec3d3ecce-kube-api-access-84drt\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143187 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-image-import-ca\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: 
I0224 02:56:32.143203 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5401c124-d8da-4335-8cd9-b8afc71fc682-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwpbz\" (UID: \"5401c124-d8da-4335-8cd9-b8afc71fc682\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143220 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143236 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-policies\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143251 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf70a70a-8a77-428b-97ca-2609ccc84a26-audit-dir\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143265 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/88764e4a-5a09-4223-b9b5-d576d2b36f41-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cldx7\" (UID: \"88764e4a-5a09-4223-b9b5-d576d2b36f41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143280 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92ksm\" (UniqueName: \"kubernetes.io/projected/88764e4a-5a09-4223-b9b5-d576d2b36f41-kube-api-access-92ksm\") pod \"openshift-apiserver-operator-796bbdcf4f-cldx7\" (UID: \"88764e4a-5a09-4223-b9b5-d576d2b36f41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143315 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-oauth-serving-cert\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143332 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf70a70a-8a77-428b-97ca-2609ccc84a26-node-pullsecrets\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143356 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 
02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143375 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c1e09aba-671f-45f8-84e1-8a3813b39383-etcd-ca\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143392 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9605a72a-cc35-4904-a7f3-bbeff4972542-trusted-ca\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143422 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143437 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-config\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143453 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzh4v\" (UniqueName: 
\"kubernetes.io/projected/348bc8fb-6fa1-40dc-82ca-99683b7e68ed-kube-api-access-dzh4v\") pod \"openshift-controller-manager-operator-756b6f6bc6-8rxcd\" (UID: \"348bc8fb-6fa1-40dc-82ca-99683b7e68ed\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143471 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9605a72a-cc35-4904-a7f3-bbeff4972542-serving-cert\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143510 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143576 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/58b2a3df-daae-45b6-8343-dedec3d3ecce-encryption-config\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143603 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjkgv\" (UniqueName: \"kubernetes.io/projected/bf70a70a-8a77-428b-97ca-2609ccc84a26-kube-api-access-xjkgv\") pod \"apiserver-76f77b778f-ll9tx\" (UID: 
\"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143637 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-config\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143657 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e09aba-671f-45f8-84e1-8a3813b39383-config\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143673 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/58b2a3df-daae-45b6-8343-dedec3d3ecce-etcd-client\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143694 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqmdb\" (UniqueName: \"kubernetes.io/projected/ba11e280-cb6f-45fb-8668-80650a1ad7bc-kube-api-access-nqmdb\") pod \"openshift-config-operator-7777fb866f-clfml\" (UID: \"ba11e280-cb6f-45fb-8668-80650a1ad7bc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143721 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-dir\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143741 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58b2a3df-daae-45b6-8343-dedec3d3ecce-audit-policies\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143757 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slqjk\" (UniqueName: \"kubernetes.io/projected/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-kube-api-access-slqjk\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143782 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bd6986-2737-455e-bb5c-570aa29f4001-config\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143798 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-audit\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143823 
4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s25w4\" (UniqueName: \"kubernetes.io/projected/c1e09aba-671f-45f8-84e1-8a3813b39383-kube-api-access-s25w4\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143840 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba11e280-cb6f-45fb-8668-80650a1ad7bc-serving-cert\") pod \"openshift-config-operator-7777fb866f-clfml\" (UID: \"ba11e280-cb6f-45fb-8668-80650a1ad7bc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143856 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd6ph\" (UniqueName: \"kubernetes.io/projected/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-kube-api-access-wd6ph\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143883 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtl7x\" (UniqueName: \"kubernetes.io/projected/52bd6986-2737-455e-bb5c-570aa29f4001-kube-api-access-rtl7x\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143901 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddqcm\" (UniqueName: 
\"kubernetes.io/projected/e59578f2-07d5-4eb9-8b58-22a2b4f73a3b-kube-api-access-ddqcm\") pod \"dns-operator-744455d44c-q2s27\" (UID: \"e59578f2-07d5-4eb9-8b58-22a2b4f73a3b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143919 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9605a72a-cc35-4904-a7f3-bbeff4972542-config\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143935 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ba11e280-cb6f-45fb-8668-80650a1ad7bc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-clfml\" (UID: \"ba11e280-cb6f-45fb-8668-80650a1ad7bc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143952 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1e09aba-671f-45f8-84e1-8a3813b39383-etcd-service-ca\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.143986 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/3f89d640-5e7f-473b-98e3-420780c10024-kube-api-access-cdtgs\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc 
kubenswrapper[4923]: I0224 02:56:32.144004 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-oauth-config\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144054 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58b2a3df-daae-45b6-8343-dedec3d3ecce-serving-cert\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144074 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-config\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144097 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e09aba-671f-45f8-84e1-8a3813b39383-serving-cert\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144117 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-client-ca\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: 
\"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144156 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-etcd-serving-ca\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144176 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-config\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144192 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144211 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144227 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-images\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144257 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bd6986-2737-455e-bb5c-570aa29f4001-serving-cert\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144275 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-auth-proxy-config\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144310 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bf70a70a-8a77-428b-97ca-2609ccc84a26-encryption-config\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144337 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: 
\"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144355 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144401 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-service-ca\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144419 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t7jh\" (UniqueName: \"kubernetes.io/projected/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-kube-api-access-2t7jh\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144569 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-serving-cert\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144624 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52bd6986-2737-455e-bb5c-570aa29f4001-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144655 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52bd6986-2737-455e-bb5c-570aa29f4001-service-ca-bundle\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144681 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58b2a3df-daae-45b6-8343-dedec3d3ecce-audit-dir\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144696 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bf70a70a-8a77-428b-97ca-2609ccc84a26-etcd-client\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144718 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf70a70a-8a77-428b-97ca-2609ccc84a26-serving-cert\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 
crc kubenswrapper[4923]: I0224 02:56:32.144745 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9nhg\" (UniqueName: \"kubernetes.io/projected/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-kube-api-access-p9nhg\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.144742 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.145256 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.155985 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hkwfz"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.156792 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.157136 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.157566 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.159243 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.162079 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.164184 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.166516 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.170548 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.172489 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.174050 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.177821 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gsv9x"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.181413 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7qxgw"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.183348 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.184491 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ll9tx"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.192862 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.198247 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.202695 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.223859 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.224440 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-292s5"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.224913 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.224989 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-w5k6j"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.225172 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.228743 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.228962 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k2q5j"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.229018 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.229182 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zswn9"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.230120 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.231812 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.231945 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-96lgm"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.232728 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfds7"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.232818 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.234066 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.234456 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jdnlf"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.235939 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.239056 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.239096 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pzbwf"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.242591 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.243875 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ffdsb"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.244771 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-95gv5"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245196 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-etcd-serving-ca\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245221 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-config\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245240 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245258 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-images\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245273 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245289 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bd6986-2737-455e-bb5c-570aa29f4001-serving-cert\") pod 
\"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245334 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-auth-proxy-config\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245349 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bf70a70a-8a77-428b-97ca-2609ccc84a26-encryption-config\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245364 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245380 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245399 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcw75\" (UniqueName: \"kubernetes.io/projected/8ae95859-0a1a-4095-89a8-77ad9197a9e9-kube-api-access-wcw75\") pod \"kube-storage-version-migrator-operator-b67b599dd-trp2g\" (UID: \"8ae95859-0a1a-4095-89a8-77ad9197a9e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245424 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-service-ca\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245439 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t7jh\" (UniqueName: \"kubernetes.io/projected/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-kube-api-access-2t7jh\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245455 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ae95859-0a1a-4095-89a8-77ad9197a9e9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-trp2g\" (UID: \"8ae95859-0a1a-4095-89a8-77ad9197a9e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245470 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d69a94d1-60bd-4bd5-90ca-7d7cd50438b6-config\") pod 
\"kube-apiserver-operator-766d6c64bb-xxmsz\" (UID: \"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245494 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-serving-cert\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245511 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52bd6986-2737-455e-bb5c-570aa29f4001-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245535 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58b2a3df-daae-45b6-8343-dedec3d3ecce-audit-dir\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245549 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bf70a70a-8a77-428b-97ca-2609ccc84a26-etcd-client\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245566 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bf70a70a-8a77-428b-97ca-2609ccc84a26-serving-cert\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245581 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9nhg\" (UniqueName: \"kubernetes.io/projected/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-kube-api-access-p9nhg\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245599 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-trusted-ca\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245614 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52bd6986-2737-455e-bb5c-570aa29f4001-service-ca-bundle\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245632 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-console-config\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245649 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-machine-approver-tls\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245667 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v55lc\" (UniqueName: \"kubernetes.io/projected/5401c124-d8da-4335-8cd9-b8afc71fc682-kube-api-access-v55lc\") pod \"cluster-samples-operator-665b6dd947-gwpbz\" (UID: \"5401c124-d8da-4335-8cd9-b8afc71fc682\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245684 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82z5g\" (UniqueName: \"kubernetes.io/projected/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-kube-api-access-82z5g\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245700 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c32d4cb-1966-42fd-ba29-1bda2bcced93-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-866rc\" (UID: \"6c32d4cb-1966-42fd-ba29-1bda2bcced93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245717 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/58b2a3df-daae-45b6-8343-dedec3d3ecce-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245733 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/348bc8fb-6fa1-40dc-82ca-99683b7e68ed-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8rxcd\" (UID: \"348bc8fb-6fa1-40dc-82ca-99683b7e68ed\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245748 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245765 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-trusted-ca-bundle\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245780 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/58b2a3df-daae-45b6-8343-dedec3d3ecce-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245800 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245815 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rqd9\" (UniqueName: \"kubernetes.io/projected/c5c02b8b-cae8-4e73-9e6f-34a8120f00c2-kube-api-access-8rqd9\") pod \"downloads-7954f5f757-gsv9x\" (UID: \"c5c02b8b-cae8-4e73-9e6f-34a8120f00c2\") " pod="openshift-console/downloads-7954f5f757-gsv9x" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245831 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c32d4cb-1966-42fd-ba29-1bda2bcced93-config\") pod \"kube-controller-manager-operator-78b949d7b-866rc\" (UID: \"6c32d4cb-1966-42fd-ba29-1bda2bcced93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245847 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88764e4a-5a09-4223-b9b5-d576d2b36f41-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cldx7\" (UID: \"88764e4a-5a09-4223-b9b5-d576d2b36f41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245862 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245878 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348bc8fb-6fa1-40dc-82ca-99683b7e68ed-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8rxcd\" (UID: \"348bc8fb-6fa1-40dc-82ca-99683b7e68ed\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245921 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-serving-cert\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245949 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzrg\" (UniqueName: \"kubernetes.io/projected/9605a72a-cc35-4904-a7f3-bbeff4972542-kube-api-access-gvzrg\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.245966 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246006 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246022 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e59578f2-07d5-4eb9-8b58-22a2b4f73a3b-metrics-tls\") pod \"dns-operator-744455d44c-q2s27\" (UID: \"e59578f2-07d5-4eb9-8b58-22a2b4f73a3b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246038 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246055 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246078 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: 
\"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246095 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84drt\" (UniqueName: \"kubernetes.io/projected/58b2a3df-daae-45b6-8343-dedec3d3ecce-kube-api-access-84drt\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246110 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-image-import-ca\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246127 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1e09aba-671f-45f8-84e1-8a3813b39383-etcd-client\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246144 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246163 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a99e06cc-b200-4073-a847-410f9799eb3a-secret-volume\") pod \"collect-profiles-29531685-dg9vc\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246181 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5401c124-d8da-4335-8cd9-b8afc71fc682-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwpbz\" (UID: \"5401c124-d8da-4335-8cd9-b8afc71fc682\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246198 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246215 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-policies\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246230 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf70a70a-8a77-428b-97ca-2609ccc84a26-audit-dir\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 
02:56:32.246246 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88764e4a-5a09-4223-b9b5-d576d2b36f41-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cldx7\" (UID: \"88764e4a-5a09-4223-b9b5-d576d2b36f41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246263 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92ksm\" (UniqueName: \"kubernetes.io/projected/88764e4a-5a09-4223-b9b5-d576d2b36f41-kube-api-access-92ksm\") pod \"openshift-apiserver-operator-796bbdcf4f-cldx7\" (UID: \"88764e4a-5a09-4223-b9b5-d576d2b36f41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246280 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c32d4cb-1966-42fd-ba29-1bda2bcced93-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-866rc\" (UID: \"6c32d4cb-1966-42fd-ba29-1bda2bcced93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246315 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-oauth-serving-cert\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246331 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246346 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c1e09aba-671f-45f8-84e1-8a3813b39383-etcd-ca\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246361 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9605a72a-cc35-4904-a7f3-bbeff4972542-trusted-ca\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246376 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf70a70a-8a77-428b-97ca-2609ccc84a26-node-pullsecrets\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246398 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246414 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-config\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246429 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzh4v\" (UniqueName: \"kubernetes.io/projected/348bc8fb-6fa1-40dc-82ca-99683b7e68ed-kube-api-access-dzh4v\") pod \"openshift-controller-manager-operator-756b6f6bc6-8rxcd\" (UID: \"348bc8fb-6fa1-40dc-82ca-99683b7e68ed\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246445 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9605a72a-cc35-4904-a7f3-bbeff4972542-serving-cert\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246462 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246478 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/58b2a3df-daae-45b6-8343-dedec3d3ecce-encryption-config\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246493 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkgv\" (UniqueName: \"kubernetes.io/projected/bf70a70a-8a77-428b-97ca-2609ccc84a26-kube-api-access-xjkgv\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246510 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9l7g\" (UniqueName: \"kubernetes.io/projected/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-kube-api-access-l9l7g\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246528 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-config\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246545 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e09aba-671f-45f8-84e1-8a3813b39383-config\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246561 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-dir\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246576 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58b2a3df-daae-45b6-8343-dedec3d3ecce-audit-policies\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246590 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/58b2a3df-daae-45b6-8343-dedec3d3ecce-etcd-client\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246606 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqmdb\" (UniqueName: \"kubernetes.io/projected/ba11e280-cb6f-45fb-8668-80650a1ad7bc-kube-api-access-nqmdb\") pod \"openshift-config-operator-7777fb866f-clfml\" (UID: \"ba11e280-cb6f-45fb-8668-80650a1ad7bc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246621 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246637 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-metrics-tls\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246654 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slqjk\" (UniqueName: \"kubernetes.io/projected/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-kube-api-access-slqjk\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246669 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bd6986-2737-455e-bb5c-570aa29f4001-config\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246682 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-audit\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246696 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s25w4\" (UniqueName: \"kubernetes.io/projected/c1e09aba-671f-45f8-84e1-8a3813b39383-kube-api-access-s25w4\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246713 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd6ph\" (UniqueName: \"kubernetes.io/projected/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-kube-api-access-wd6ph\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246728 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtl7x\" (UniqueName: \"kubernetes.io/projected/52bd6986-2737-455e-bb5c-570aa29f4001-kube-api-access-rtl7x\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246742 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddqcm\" (UniqueName: \"kubernetes.io/projected/e59578f2-07d5-4eb9-8b58-22a2b4f73a3b-kube-api-access-ddqcm\") pod \"dns-operator-744455d44c-q2s27\" (UID: \"e59578f2-07d5-4eb9-8b58-22a2b4f73a3b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2s27"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246758 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba11e280-cb6f-45fb-8668-80650a1ad7bc-serving-cert\") pod \"openshift-config-operator-7777fb866f-clfml\" (UID: \"ba11e280-cb6f-45fb-8668-80650a1ad7bc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246773 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae95859-0a1a-4095-89a8-77ad9197a9e9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-trp2g\" (UID: \"8ae95859-0a1a-4095-89a8-77ad9197a9e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246790 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qbb\" (UniqueName: \"kubernetes.io/projected/a99e06cc-b200-4073-a847-410f9799eb3a-kube-api-access-46qbb\") pod \"collect-profiles-29531685-dg9vc\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246808 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9605a72a-cc35-4904-a7f3-bbeff4972542-config\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246826 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ba11e280-cb6f-45fb-8668-80650a1ad7bc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-clfml\" (UID: \"ba11e280-cb6f-45fb-8668-80650a1ad7bc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246841 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1e09aba-671f-45f8-84e1-8a3813b39383-etcd-service-ca\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246859 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/3f89d640-5e7f-473b-98e3-420780c10024-kube-api-access-cdtgs\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246874 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a99e06cc-b200-4073-a847-410f9799eb3a-config-volume\") pod \"collect-profiles-29531685-dg9vc\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246891 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-oauth-config\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246908 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58b2a3df-daae-45b6-8343-dedec3d3ecce-serving-cert\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246924 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-config\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246939 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e09aba-671f-45f8-84e1-8a3813b39383-serving-cert\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246953 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-client-ca\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246968 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d69a94d1-60bd-4bd5-90ca-7d7cd50438b6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xxmsz\" (UID: \"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246982 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69a94d1-60bd-4bd5-90ca-7d7cd50438b6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xxmsz\" (UID: \"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.247076 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.247350 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58b2a3df-daae-45b6-8343-dedec3d3ecce-audit-dir\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.247772 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52bd6986-2737-455e-bb5c-570aa29f4001-service-ca-bundle\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.247980 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-config\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.248087 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/58b2a3df-daae-45b6-8343-dedec3d3ecce-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.248346 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52bd6986-2737-455e-bb5c-570aa29f4001-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.248413 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g"]
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.248648 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-console-config\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.250711 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.251250 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-etcd-serving-ca\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.252130 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348bc8fb-6fa1-40dc-82ca-99683b7e68ed-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8rxcd\" (UID: \"348bc8fb-6fa1-40dc-82ca-99683b7e68ed\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.252903 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-machine-approver-tls\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.252911 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq"]
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.252937 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-82dxq"]
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.252947 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-clfml"]
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.253162 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-config\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.253595 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58b2a3df-daae-45b6-8343-dedec3d3ecce-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.254122 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88764e4a-5a09-4223-b9b5-d576d2b36f41-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cldx7\" (UID: \"88764e4a-5a09-4223-b9b5-d576d2b36f41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.254581 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9605a72a-cc35-4904-a7f3-bbeff4972542-config\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.246805 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-service-ca\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.254932 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ba11e280-cb6f-45fb-8668-80650a1ad7bc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-clfml\" (UID: \"ba11e280-cb6f-45fb-8668-80650a1ad7bc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.254955 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.255149 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-dir\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.255273 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.255441 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1e09aba-671f-45f8-84e1-8a3813b39383-etcd-service-ca\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.255697 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58b2a3df-daae-45b6-8343-dedec3d3ecce-audit-policies\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.255809 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7qxgw"]
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.255817 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-serving-cert\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.256283 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52bd6986-2737-455e-bb5c-570aa29f4001-config\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.256337 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-config\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.256841 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-oauth-serving-cert\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.257523 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.257559 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-audit\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.257683 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e59578f2-07d5-4eb9-8b58-22a2b4f73a3b-metrics-tls\") pod \"dns-operator-744455d44c-q2s27\" (UID: \"e59578f2-07d5-4eb9-8b58-22a2b4f73a3b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2s27"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.257730 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-images\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.258213 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-oauth-config\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.258841 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-policies\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.258869 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf70a70a-8a77-428b-97ca-2609ccc84a26-audit-dir\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.259044 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.259088 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bf70a70a-8a77-428b-97ca-2609ccc84a26-image-import-ca\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.259333 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-trusted-ca-bundle\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.259560 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-auth-proxy-config\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.259699 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e09aba-671f-45f8-84e1-8a3813b39383-config\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.260272 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-config\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.260351 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf70a70a-8a77-428b-97ca-2609ccc84a26-node-pullsecrets\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.260486 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9605a72a-cc35-4904-a7f3-bbeff4972542-trusted-ca\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.261015 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-client-ca\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.261044 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c1e09aba-671f-45f8-84e1-8a3813b39383-etcd-ca\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.261170 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.261618 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1e09aba-671f-45f8-84e1-8a3813b39383-etcd-client\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.262171 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g"]
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.262271 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.262307 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bf70a70a-8a77-428b-97ca-2609ccc84a26-encryption-config\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.262455 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.262724 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/52bd6986-2737-455e-bb5c-570aa29f4001-serving-cert\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.263138 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/58b2a3df-daae-45b6-8343-dedec3d3ecce-encryption-config\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.263395 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5401c124-d8da-4335-8cd9-b8afc71fc682-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gwpbz\" (UID: \"5401c124-d8da-4335-8cd9-b8afc71fc682\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz"
Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.263763 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e09aba-671f-45f8-84e1-8a3813b39383-serving-cert\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.263924 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.264275 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.264998 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9605a72a-cc35-4904-a7f3-bbeff4972542-serving-cert\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.262415 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58b2a3df-daae-45b6-8343-dedec3d3ecce-serving-cert\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.265101 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/58b2a3df-daae-45b6-8343-dedec3d3ecce-etcd-client\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: \"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.265429 4923 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.265569 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88764e4a-5a09-4223-b9b5-d576d2b36f41-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cldx7\" (UID: \"88764e4a-5a09-4223-b9b5-d576d2b36f41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.265737 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.265867 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bf70a70a-8a77-428b-97ca-2609ccc84a26-etcd-client\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.266049 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:32 crc 
kubenswrapper[4923]: I0224 02:56:32.266324 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf70a70a-8a77-428b-97ca-2609ccc84a26-serving-cert\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.266384 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba11e280-cb6f-45fb-8668-80650a1ad7bc-serving-cert\") pod \"openshift-config-operator-7777fb866f-clfml\" (UID: \"ba11e280-cb6f-45fb-8668-80650a1ad7bc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.266742 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-serving-cert\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.267417 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.269319 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.270890 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/348bc8fb-6fa1-40dc-82ca-99683b7e68ed-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8rxcd\" (UID: \"348bc8fb-6fa1-40dc-82ca-99683b7e68ed\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.274278 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.277423 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.277730 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.279318 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hkwfz"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.280193 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.281093 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.282043 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.283772 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.286542 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9jk4q"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.288177 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v6g8h"] Feb 24 02:56:32 crc 
kubenswrapper[4923]: I0224 02:56:32.290892 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.292582 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-292s5"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.292612 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-96lgm"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.292750 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v6g8h" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.293457 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9jk4q"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.294922 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.297700 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.302846 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v6g8h"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.302993 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.303542 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qgk6z"] Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.304370 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qgk6z" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.309874 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.330188 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.347719 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.347778 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-metrics-tls\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.347833 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae95859-0a1a-4095-89a8-77ad9197a9e9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-trp2g\" (UID: \"8ae95859-0a1a-4095-89a8-77ad9197a9e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.347874 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qbb\" (UniqueName: 
\"kubernetes.io/projected/a99e06cc-b200-4073-a847-410f9799eb3a-kube-api-access-46qbb\") pod \"collect-profiles-29531685-dg9vc\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.347898 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a99e06cc-b200-4073-a847-410f9799eb3a-config-volume\") pod \"collect-profiles-29531685-dg9vc\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.347936 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69a94d1-60bd-4bd5-90ca-7d7cd50438b6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xxmsz\" (UID: \"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.347962 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d69a94d1-60bd-4bd5-90ca-7d7cd50438b6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xxmsz\" (UID: \"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348001 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcw75\" (UniqueName: \"kubernetes.io/projected/8ae95859-0a1a-4095-89a8-77ad9197a9e9-kube-api-access-wcw75\") pod \"kube-storage-version-migrator-operator-b67b599dd-trp2g\" (UID: \"8ae95859-0a1a-4095-89a8-77ad9197a9e9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348049 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ae95859-0a1a-4095-89a8-77ad9197a9e9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-trp2g\" (UID: \"8ae95859-0a1a-4095-89a8-77ad9197a9e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348101 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d69a94d1-60bd-4bd5-90ca-7d7cd50438b6-config\") pod \"kube-apiserver-operator-766d6c64bb-xxmsz\" (UID: \"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348141 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-trusted-ca\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348171 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c32d4cb-1966-42fd-ba29-1bda2bcced93-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-866rc\" (UID: \"6c32d4cb-1966-42fd-ba29-1bda2bcced93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348208 4923 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-82z5g\" (UniqueName: \"kubernetes.io/projected/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-kube-api-access-82z5g\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348235 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348282 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c32d4cb-1966-42fd-ba29-1bda2bcced93-config\") pod \"kube-controller-manager-operator-78b949d7b-866rc\" (UID: \"6c32d4cb-1966-42fd-ba29-1bda2bcced93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348332 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348365 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: 
\"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348428 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a99e06cc-b200-4073-a847-410f9799eb3a-secret-volume\") pod \"collect-profiles-29531685-dg9vc\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348479 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c32d4cb-1966-42fd-ba29-1bda2bcced93-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-866rc\" (UID: \"6c32d4cb-1966-42fd-ba29-1bda2bcced93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.348535 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9l7g\" (UniqueName: \"kubernetes.io/projected/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-kube-api-access-l9l7g\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.350264 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.350453 4923 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.356829 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.370575 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.389857 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.402273 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d69a94d1-60bd-4bd5-90ca-7d7cd50438b6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xxmsz\" (UID: \"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.410703 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.419157 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d69a94d1-60bd-4bd5-90ca-7d7cd50438b6-config\") pod \"kube-apiserver-operator-766d6c64bb-xxmsz\" (UID: \"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 
02:56:32.431093 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.450055 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.470256 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.490251 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.509672 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.530195 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.550108 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.562414 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c32d4cb-1966-42fd-ba29-1bda2bcced93-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-866rc\" (UID: \"6c32d4cb-1966-42fd-ba29-1bda2bcced93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.570468 4923 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.589475 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.599279 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c32d4cb-1966-42fd-ba29-1bda2bcced93-config\") pod \"kube-controller-manager-operator-78b949d7b-866rc\" (UID: \"6c32d4cb-1966-42fd-ba29-1bda2bcced93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.609799 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.631327 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.649635 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.671047 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.690347 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.711006 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.712267 
4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.727656 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-metrics-tls\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.740958 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.749592 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-trusted-ca\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.749784 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.769501 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.790436 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.809920 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.831290 4923 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.850274 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.862179 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae95859-0a1a-4095-89a8-77ad9197a9e9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-trp2g\" (UID: \"8ae95859-0a1a-4095-89a8-77ad9197a9e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.870951 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.880124 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ae95859-0a1a-4095-89a8-77ad9197a9e9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-trp2g\" (UID: \"8ae95859-0a1a-4095-89a8-77ad9197a9e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.890370 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.930550 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.951157 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 
24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.977851 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 24 02:56:32 crc kubenswrapper[4923]: I0224 02:56:32.990558 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.011384 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.031737 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.051130 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.069735 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.090988 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.110214 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.128673 4923 request.go:700] Waited for 1.001790457s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.130983 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 24 02:56:33 
crc kubenswrapper[4923]: I0224 02:56:33.141982 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a99e06cc-b200-4073-a847-410f9799eb3a-secret-volume\") pod \"collect-profiles-29531685-dg9vc\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.150988 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.171280 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.190687 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.210895 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.230606 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.250989 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: E0224 02:56:33.257101 4923 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Feb 24 02:56:33 crc kubenswrapper[4923]: E0224 02:56:33.257292 4923 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-proxy-ca-bundles podName:0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3 nodeName:}" failed. No retries permitted until 2026-02-24 02:56:33.757258114 +0000 UTC m=+117.774328967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-proxy-ca-bundles") pod "controller-manager-879f6c89f-k2q5j" (UID: "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3") : failed to sync configmap cache: timed out waiting for the condition Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.272056 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.291774 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.311092 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.325334 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a99e06cc-b200-4073-a847-410f9799eb3a-config-volume\") pod \"collect-profiles-29531685-dg9vc\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.351206 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.370752 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 24 02:56:33 crc 
kubenswrapper[4923]: I0224 02:56:33.390926 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.410438 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.431577 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.451837 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.471159 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.490140 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.510068 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.531114 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.551575 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.571267 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.590634 4923 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.611182 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.630969 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.651633 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.671126 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.690603 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.710054 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.715681 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.715782 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.715681 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.741066 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.751501 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.772610 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.775109 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.791438 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.811792 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.831270 4923 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.852449 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.892931 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9nhg\" (UniqueName: 
\"kubernetes.io/projected/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-kube-api-access-p9nhg\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.919866 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t7jh\" (UniqueName: \"kubernetes.io/projected/9b5ccce4-6412-43a0-bcea-5f88c4b3b47c-kube-api-access-2t7jh\") pod \"machine-approver-56656f9798-w6rtx\" (UID: \"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.939215 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v55lc\" (UniqueName: \"kubernetes.io/projected/5401c124-d8da-4335-8cd9-b8afc71fc682-kube-api-access-v55lc\") pod \"cluster-samples-operator-665b6dd947-gwpbz\" (UID: \"5401c124-d8da-4335-8cd9-b8afc71fc682\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.947398 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rqd9\" (UniqueName: \"kubernetes.io/projected/c5c02b8b-cae8-4e73-9e6f-34a8120f00c2-kube-api-access-8rqd9\") pod \"downloads-7954f5f757-gsv9x\" (UID: \"c5c02b8b-cae8-4e73-9e6f-34a8120f00c2\") " pod="openshift-console/downloads-7954f5f757-gsv9x" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.974270 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/3f89d640-5e7f-473b-98e3-420780c10024-kube-api-access-cdtgs\") pod \"console-f9d7485db-w5k6j\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:33 crc kubenswrapper[4923]: I0224 02:56:33.977200 4923 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.000206 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s25w4\" (UniqueName: \"kubernetes.io/projected/c1e09aba-671f-45f8-84e1-8a3813b39383-kube-api-access-s25w4\") pod \"etcd-operator-b45778765-pzbwf\" (UID: \"c1e09aba-671f-45f8-84e1-8a3813b39383\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.006874 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slqjk\" (UniqueName: \"kubernetes.io/projected/ae174d22-78c6-4699-9d9b-8ce566dc9f4c-kube-api-access-slqjk\") pod \"machine-api-operator-5694c8668f-82dxq\" (UID: \"ae174d22-78c6-4699-9d9b-8ce566dc9f4c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.033182 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.043373 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.046770 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqmdb\" (UniqueName: \"kubernetes.io/projected/ba11e280-cb6f-45fb-8668-80650a1ad7bc-kube-api-access-nqmdb\") pod \"openshift-config-operator-7777fb866f-clfml\" (UID: \"ba11e280-cb6f-45fb-8668-80650a1ad7bc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.056316 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.074766 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92ksm\" (UniqueName: \"kubernetes.io/projected/88764e4a-5a09-4223-b9b5-d576d2b36f41-kube-api-access-92ksm\") pod \"openshift-apiserver-operator-796bbdcf4f-cldx7\" (UID: \"88764e4a-5a09-4223-b9b5-d576d2b36f41\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.088683 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzrg\" (UniqueName: \"kubernetes.io/projected/9605a72a-cc35-4904-a7f3-bbeff4972542-kube-api-access-gvzrg\") pod \"console-operator-58897d9998-ffdsb\" (UID: \"9605a72a-cc35-4904-a7f3-bbeff4972542\") " pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.111682 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzh4v\" (UniqueName: \"kubernetes.io/projected/348bc8fb-6fa1-40dc-82ca-99683b7e68ed-kube-api-access-dzh4v\") pod \"openshift-controller-manager-operator-756b6f6bc6-8rxcd\" (UID: \"348bc8fb-6fa1-40dc-82ca-99683b7e68ed\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.114196 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.137185 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.138131 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtl7x\" (UniqueName: \"kubernetes.io/projected/52bd6986-2737-455e-bb5c-570aa29f4001-kube-api-access-rtl7x\") pod \"authentication-operator-69f744f599-jdnlf\" (UID: \"52bd6986-2737-455e-bb5c-570aa29f4001\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.146045 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.146548 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.149499 4923 request.go:700] Waited for 1.889521509s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.150022 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd6ph\" (UniqueName: \"kubernetes.io/projected/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-kube-api-access-wd6ph\") pod \"oauth-openshift-558db77b4-lfds7\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.180345 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84drt\" (UniqueName: \"kubernetes.io/projected/58b2a3df-daae-45b6-8343-dedec3d3ecce-kube-api-access-84drt\") pod \"apiserver-7bbb656c7d-l9wvb\" (UID: 
\"58b2a3df-daae-45b6-8343-dedec3d3ecce\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.191352 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.191616 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjkgv\" (UniqueName: \"kubernetes.io/projected/bf70a70a-8a77-428b-97ca-2609ccc84a26-kube-api-access-xjkgv\") pod \"apiserver-76f77b778f-ll9tx\" (UID: \"bf70a70a-8a77-428b-97ca-2609ccc84a26\") " pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.195282 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.210509 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.214555 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gsv9x" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.231966 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.246783 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.251397 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.265584 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.266189 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-w5k6j"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.272061 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.276343 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" event={"ID":"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c","Type":"ContainerStarted","Data":"60f4b71345b65e47ad398f1225cd4ac7120042d05048693514c2945bf4142b48"} Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.292314 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.312131 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.331275 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.350525 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.370709 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.404252 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.410163 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.419141 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.429480 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.444722 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69a94d1-60bd-4bd5-90ca-7d7cd50438b6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xxmsz\" (UID: \"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.451431 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qbb\" (UniqueName: \"kubernetes.io/projected/a99e06cc-b200-4073-a847-410f9799eb3a-kube-api-access-46qbb\") pod \"collect-profiles-29531685-dg9vc\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.468017 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcw75\" (UniqueName: 
\"kubernetes.io/projected/8ae95859-0a1a-4095-89a8-77ad9197a9e9-kube-api-access-wcw75\") pod \"kube-storage-version-migrator-operator-b67b599dd-trp2g\" (UID: \"8ae95859-0a1a-4095-89a8-77ad9197a9e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.479716 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.489493 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c32d4cb-1966-42fd-ba29-1bda2bcced93-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-866rc\" (UID: \"6c32d4cb-1966-42fd-ba29-1bda2bcced93\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.502667 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.520241 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82z5g\" (UniqueName: \"kubernetes.io/projected/99dcd7e1-2d1d-4d12-a279-02c56e1d96c6-kube-api-access-82z5g\") pod \"cluster-image-registry-operator-dc59b4c8b-bgtwq\" (UID: \"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.537562 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.544694 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.550848 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.551320 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9l7g\" (UniqueName: \"kubernetes.io/projected/60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6-kube-api-access-l9l7g\") pod \"ingress-operator-5b745b69d9-gvr8j\" (UID: \"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.570075 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.588819 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.611191 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.631916 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.649200 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddqcm\" (UniqueName: \"kubernetes.io/projected/e59578f2-07d5-4eb9-8b58-22a2b4f73a3b-kube-api-access-ddqcm\") pod \"dns-operator-744455d44c-q2s27\" (UID: \"e59578f2-07d5-4eb9-8b58-22a2b4f73a3b\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.651924 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-82dxq"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.679577 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.687678 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-k2q5j\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.690866 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700287 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e47a3768-fc24-4a98-8ed4-2264127d71cd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700332 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn8s6\" (UniqueName: \"kubernetes.io/projected/e7391920-399a-4773-bc06-5b62900f1206-kube-api-access-gn8s6\") pod \"catalog-operator-68c6474976-jwtm2\" (UID: \"e7391920-399a-4773-bc06-5b62900f1206\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700379 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700398 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9ck\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-kube-api-access-2c9ck\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700425 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4b13000e-92ec-4de6-9e62-9232504476b4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-74xcz\" (UID: \"4b13000e-92ec-4de6-9e62-9232504476b4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700462 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10a4e000-d2a9-455f-a7a7-ae4d90611c29-metrics-certs\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700519 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/10a4e000-d2a9-455f-a7a7-ae4d90611c29-stats-auth\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700544 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e47a3768-fc24-4a98-8ed4-2264127d71cd-proxy-tls\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700567 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10a4e000-d2a9-455f-a7a7-ae4d90611c29-service-ca-bundle\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc 
kubenswrapper[4923]: I0224 02:56:34.700637 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7391920-399a-4773-bc06-5b62900f1206-profile-collector-cert\") pod \"catalog-operator-68c6474976-jwtm2\" (UID: \"e7391920-399a-4773-bc06-5b62900f1206\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700680 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-bound-sa-token\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700722 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-serving-cert\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700772 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49g2l\" (UniqueName: \"kubernetes.io/projected/4b13000e-92ec-4de6-9e62-9232504476b4-kube-api-access-49g2l\") pod \"package-server-manager-789f6589d5-74xcz\" (UID: \"4b13000e-92ec-4de6-9e62-9232504476b4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700791 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-config\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700809 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7227def5-b373-488f-9f56-4b6ed170751d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700824 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7jxs\" (UniqueName: \"kubernetes.io/projected/8e054d71-6ccc-4ab7-905a-5b1053bdbdc7-kube-api-access-d7jxs\") pod \"migrator-59844c95c7-x278g\" (UID: \"8e054d71-6ccc-4ab7-905a-5b1053bdbdc7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700838 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/10a4e000-d2a9-455f-a7a7-ae4d90611c29-default-certificate\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700852 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px68c\" (UniqueName: \"kubernetes.io/projected/10a4e000-d2a9-455f-a7a7-ae4d90611c29-kube-api-access-px68c\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " 
pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700867 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81743785-297e-4cbd-b84e-de806ada8d8f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pjqpk\" (UID: \"81743785-297e-4cbd-b84e-de806ada8d8f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700881 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/436a7a59-c116-4383-b580-19d167e74eeb-srv-cert\") pod \"olm-operator-6b444d44fb-gg97h\" (UID: \"436a7a59-c116-4383-b580-19d167e74eeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700898 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-trusted-ca\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700912 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-client-ca\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700928 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-registry-certificates\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700952 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e47a3768-fc24-4a98-8ed4-2264127d71cd-images\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.700979 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7227def5-b373-488f-9f56-4b6ed170751d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.701013 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-registry-tls\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.701038 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81743785-297e-4cbd-b84e-de806ada8d8f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pjqpk\" (UID: \"81743785-297e-4cbd-b84e-de806ada8d8f\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.701093 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7k5\" (UniqueName: \"kubernetes.io/projected/436a7a59-c116-4383-b580-19d167e74eeb-kube-api-access-sc7k5\") pod \"olm-operator-6b444d44fb-gg97h\" (UID: \"436a7a59-c116-4383-b580-19d167e74eeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.701142 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m549\" (UniqueName: \"kubernetes.io/projected/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-kube-api-access-7m549\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.701163 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81743785-297e-4cbd-b84e-de806ada8d8f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pjqpk\" (UID: \"81743785-297e-4cbd-b84e-de806ada8d8f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.701200 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdw77\" (UniqueName: \"kubernetes.io/projected/e47a3768-fc24-4a98-8ed4-2264127d71cd-kube-api-access-gdw77\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 
02:56:34.701254 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7391920-399a-4773-bc06-5b62900f1206-srv-cert\") pod \"catalog-operator-68c6474976-jwtm2\" (UID: \"e7391920-399a-4773-bc06-5b62900f1206\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.701284 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/436a7a59-c116-4383-b580-19d167e74eeb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gg97h\" (UID: \"436a7a59-c116-4383-b580-19d167e74eeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:34 crc kubenswrapper[4923]: E0224 02:56:34.702073 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.202048853 +0000 UTC m=+119.219119716 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.718897 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.723977 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ffdsb"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.724014 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.728940 4923 scope.go:117] "RemoveContainer" containerID="ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.733686 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 24 02:56:34 crc kubenswrapper[4923]: W0224 02:56:34.740721 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9605a72a_cc35_4904_a7f3_bbeff4972542.slice/crio-db3bcfc69b820f7c3214d1b4875c37e6a6a873fd3aa6a85e5f09e188de6b9166 WatchSource:0}: Error finding container db3bcfc69b820f7c3214d1b4875c37e6a6a873fd3aa6a85e5f09e188de6b9166: Status 404 returned error can't find the container with id db3bcfc69b820f7c3214d1b4875c37e6a6a873fd3aa6a85e5f09e188de6b9166 Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 
02:56:34.750101 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.759013 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.773402 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.776567 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ll9tx"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.790850 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.802266 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.802684 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803037 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10a4e000-d2a9-455f-a7a7-ae4d90611c29-service-ca-bundle\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803115 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvd29\" (UniqueName: \"kubernetes.io/projected/30e1ab71-a068-4593-9dc7-f1f7731caeb9-kube-api-access-wvd29\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803180 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-292s5\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803212 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/30e1ab71-a068-4593-9dc7-f1f7731caeb9-ready\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803498 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-tmpfs\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803515 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvxs\" (UniqueName: \"kubernetes.io/projected/93648d39-0b9e-4eda-9f09-a299571f3259-kube-api-access-nvvxs\") pod \"ingress-canary-v6g8h\" (UID: \"93648d39-0b9e-4eda-9f09-a299571f3259\") 
" pod="openshift-ingress-canary/ingress-canary-v6g8h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803666 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjtsp\" (UniqueName: \"kubernetes.io/projected/a210d674-dc51-4a19-8172-a76c264f6f0d-kube-api-access-sjtsp\") pod \"machine-config-server-qgk6z\" (UID: \"a210d674-dc51-4a19-8172-a76c264f6f0d\") " pod="openshift-machine-config-operator/machine-config-server-qgk6z" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803789 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7391920-399a-4773-bc06-5b62900f1206-profile-collector-cert\") pod \"catalog-operator-68c6474976-jwtm2\" (UID: \"e7391920-399a-4773-bc06-5b62900f1206\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803821 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c-config-volume\") pod \"dns-default-9jk4q\" (UID: \"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c\") " pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.803838 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-bound-sa-token\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: E0224 02:56:34.804384 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.304353239 +0000 UTC m=+119.321424052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.804446 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-serving-cert\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.804480 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/30e1ab71-a068-4593-9dc7-f1f7731caeb9-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.804501 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-registration-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.804870 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10a4e000-d2a9-455f-a7a7-ae4d90611c29-service-ca-bundle\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805493 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49g2l\" (UniqueName: \"kubernetes.io/projected/4b13000e-92ec-4de6-9e62-9232504476b4-kube-api-access-49g2l\") pod \"package-server-manager-789f6589d5-74xcz\" (UID: \"4b13000e-92ec-4de6-9e62-9232504476b4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805529 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-config\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805572 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5417593c-9d13-4adf-8579-c7e9331d6aa8-signing-cabundle\") pod \"service-ca-9c57cc56f-7qxgw\" (UID: \"5417593c-9d13-4adf-8579-c7e9331d6aa8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805590 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7jxs\" (UniqueName: \"kubernetes.io/projected/8e054d71-6ccc-4ab7-905a-5b1053bdbdc7-kube-api-access-d7jxs\") pod \"migrator-59844c95c7-x278g\" (UID: \"8e054d71-6ccc-4ab7-905a-5b1053bdbdc7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g" Feb 24 
02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805607 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/10a4e000-d2a9-455f-a7a7-ae4d90611c29-default-certificate\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805628 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px68c\" (UniqueName: \"kubernetes.io/projected/10a4e000-d2a9-455f-a7a7-ae4d90611c29-kube-api-access-px68c\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805649 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7227def5-b373-488f-9f56-4b6ed170751d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805667 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81743785-297e-4cbd-b84e-de806ada8d8f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pjqpk\" (UID: \"81743785-297e-4cbd-b84e-de806ada8d8f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805684 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/436a7a59-c116-4383-b580-19d167e74eeb-srv-cert\") pod \"olm-operator-6b444d44fb-gg97h\" (UID: 
\"436a7a59-c116-4383-b580-19d167e74eeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805702 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxj7v\" (UniqueName: \"kubernetes.io/projected/9dcee5b9-ae18-4145-9af2-c4ebf1e36294-kube-api-access-dxj7v\") pod \"machine-config-controller-84d6567774-vvpzd\" (UID: \"9dcee5b9-ae18-4145-9af2-c4ebf1e36294\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805732 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5417593c-9d13-4adf-8579-c7e9331d6aa8-signing-key\") pod \"service-ca-9c57cc56f-7qxgw\" (UID: \"5417593c-9d13-4adf-8579-c7e9331d6aa8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805749 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/34d9b54c-a37d-407b-81c7-ff77a96b7dd8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rx89l\" (UID: \"34d9b54c-a37d-407b-81c7-ff77a96b7dd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805785 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-trusted-ca\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805803 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-client-ca\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805831 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-registry-certificates\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805849 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98r85\" (UniqueName: \"kubernetes.io/projected/1ea1c9d1-2053-4519-a342-979703b00a41-kube-api-access-98r85\") pod \"service-ca-operator-777779d784-kmr2g\" (UID: \"1ea1c9d1-2053-4519-a342-979703b00a41\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805871 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e47a3768-fc24-4a98-8ed4-2264127d71cd-images\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805902 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-socket-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: 
\"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805922 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7227def5-b373-488f-9f56-4b6ed170751d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805942 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8827\" (UniqueName: \"kubernetes.io/projected/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-kube-api-access-g8827\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.805960 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-registry-tls\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.807429 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81743785-297e-4cbd-b84e-de806ada8d8f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pjqpk\" (UID: \"81743785-297e-4cbd-b84e-de806ada8d8f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.807578 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9dcee5b9-ae18-4145-9af2-c4ebf1e36294-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vvpzd\" (UID: \"9dcee5b9-ae18-4145-9af2-c4ebf1e36294\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.807725 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30e1ab71-a068-4593-9dc7-f1f7731caeb9-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.807816 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-csi-data-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.807948 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a210d674-dc51-4a19-8172-a76c264f6f0d-certs\") pod \"machine-config-server-qgk6z\" (UID: \"a210d674-dc51-4a19-8172-a76c264f6f0d\") " pod="openshift-machine-config-operator/machine-config-server-qgk6z" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808215 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7k5\" (UniqueName: \"kubernetes.io/projected/436a7a59-c116-4383-b580-19d167e74eeb-kube-api-access-sc7k5\") pod \"olm-operator-6b444d44fb-gg97h\" (UID: \"436a7a59-c116-4383-b580-19d167e74eeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:34 crc 
kubenswrapper[4923]: I0224 02:56:34.808369 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-config\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808382 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n2zj\" (UniqueName: \"kubernetes.io/projected/34d9b54c-a37d-407b-81c7-ff77a96b7dd8-kube-api-access-7n2zj\") pod \"control-plane-machine-set-operator-78cbb6b69f-rx89l\" (UID: \"34d9b54c-a37d-407b-81c7-ff77a96b7dd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808463 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c-metrics-tls\") pod \"dns-default-9jk4q\" (UID: \"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c\") " pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808484 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-webhook-cert\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808503 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ea1c9d1-2053-4519-a342-979703b00a41-serving-cert\") pod 
\"service-ca-operator-777779d784-kmr2g\" (UID: \"1ea1c9d1-2053-4519-a342-979703b00a41\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808570 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m549\" (UniqueName: \"kubernetes.io/projected/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-kube-api-access-7m549\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808601 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81743785-297e-4cbd-b84e-de806ada8d8f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pjqpk\" (UID: \"81743785-297e-4cbd-b84e-de806ada8d8f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808631 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v979w\" (UniqueName: \"kubernetes.io/projected/ea015236-90a9-4feb-bc5a-800e9831fcd6-kube-api-access-v979w\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808665 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdw77\" (UniqueName: \"kubernetes.io/projected/e47a3768-fc24-4a98-8ed4-2264127d71cd-kube-api-access-gdw77\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808713 
4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7391920-399a-4773-bc06-5b62900f1206-srv-cert\") pod \"catalog-operator-68c6474976-jwtm2\" (UID: \"e7391920-399a-4773-bc06-5b62900f1206\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808753 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mz2\" (UniqueName: \"kubernetes.io/projected/b48a43d4-438d-40a9-8982-74ea7020664a-kube-api-access-x9mz2\") pod \"multus-admission-controller-857f4d67dd-hkwfz\" (UID: \"b48a43d4-438d-40a9-8982-74ea7020664a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808786 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/436a7a59-c116-4383-b580-19d167e74eeb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gg97h\" (UID: \"436a7a59-c116-4383-b580-19d167e74eeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808813 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b48a43d4-438d-40a9-8982-74ea7020664a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hkwfz\" (UID: \"b48a43d4-438d-40a9-8982-74ea7020664a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808844 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e47a3768-fc24-4a98-8ed4-2264127d71cd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: 
\"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808871 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn8s6\" (UniqueName: \"kubernetes.io/projected/e7391920-399a-4773-bc06-5b62900f1206-kube-api-access-gn8s6\") pod \"catalog-operator-68c6474976-jwtm2\" (UID: \"e7391920-399a-4773-bc06-5b62900f1206\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808910 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfr2k\" (UniqueName: \"kubernetes.io/projected/b66c0222-b5e0-4d1e-841e-507c8e61e482-kube-api-access-pfr2k\") pod \"marketplace-operator-79b997595-292s5\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808939 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93648d39-0b9e-4eda-9f09-a299571f3259-cert\") pod \"ingress-canary-v6g8h\" (UID: \"93648d39-0b9e-4eda-9f09-a299571f3259\") " pod="openshift-ingress-canary/ingress-canary-v6g8h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808981 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809006 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2c9ck\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-kube-api-access-2c9ck\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809094 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b13000e-92ec-4de6-9e62-9232504476b4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-74xcz\" (UID: \"4b13000e-92ec-4de6-9e62-9232504476b4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809160 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-plugins-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809215 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-apiservice-cert\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809242 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10a4e000-d2a9-455f-a7a7-ae4d90611c29-metrics-certs\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 
24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809261 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a210d674-dc51-4a19-8172-a76c264f6f0d-node-bootstrap-token\") pod \"machine-config-server-qgk6z\" (UID: \"a210d674-dc51-4a19-8172-a76c264f6f0d\") " pod="openshift-machine-config-operator/machine-config-server-qgk6z" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809286 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9dcee5b9-ae18-4145-9af2-c4ebf1e36294-proxy-tls\") pod \"machine-config-controller-84d6567774-vvpzd\" (UID: \"9dcee5b9-ae18-4145-9af2-c4ebf1e36294\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809338 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-292s5\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809366 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxxsd\" (UniqueName: \"kubernetes.io/projected/c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c-kube-api-access-pxxsd\") pod \"dns-default-9jk4q\" (UID: \"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c\") " pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809401 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1ea1c9d1-2053-4519-a342-979703b00a41-config\") pod \"service-ca-operator-777779d784-kmr2g\" (UID: \"1ea1c9d1-2053-4519-a342-979703b00a41\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809458 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-mountpoint-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809499 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/10a4e000-d2a9-455f-a7a7-ae4d90611c29-stats-auth\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809518 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2hj4\" (UniqueName: \"kubernetes.io/projected/5417593c-9d13-4adf-8579-c7e9331d6aa8-kube-api-access-h2hj4\") pod \"service-ca-9c57cc56f-7qxgw\" (UID: \"5417593c-9d13-4adf-8579-c7e9331d6aa8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.809544 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e47a3768-fc24-4a98-8ed4-2264127d71cd-proxy-tls\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.808384 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7227def5-b373-488f-9f56-4b6ed170751d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.813161 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81743785-297e-4cbd-b84e-de806ada8d8f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pjqpk\" (UID: \"81743785-297e-4cbd-b84e-de806ada8d8f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.813914 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e47a3768-fc24-4a98-8ed4-2264127d71cd-images\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.815225 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-client-ca\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.816017 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/436a7a59-c116-4383-b580-19d167e74eeb-srv-cert\") pod \"olm-operator-6b444d44fb-gg97h\" (UID: \"436a7a59-c116-4383-b580-19d167e74eeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:34 crc 
kubenswrapper[4923]: I0224 02:56:34.816158 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-trusted-ca\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: E0224 02:56:34.817673 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.31765145 +0000 UTC m=+119.334722263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.818099 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.818836 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-clfml"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.819279 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b13000e-92ec-4de6-9e62-9232504476b4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-74xcz\" (UID: \"4b13000e-92ec-4de6-9e62-9232504476b4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.820898 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/10a4e000-d2a9-455f-a7a7-ae4d90611c29-default-certificate\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.821140 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/10a4e000-d2a9-455f-a7a7-ae4d90611c29-stats-auth\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.821156 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-serving-cert\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc 
kubenswrapper[4923]: I0224 02:56:34.821558 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7227def5-b373-488f-9f56-4b6ed170751d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.821668 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7391920-399a-4773-bc06-5b62900f1206-srv-cert\") pod \"catalog-operator-68c6474976-jwtm2\" (UID: \"e7391920-399a-4773-bc06-5b62900f1206\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.821819 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7391920-399a-4773-bc06-5b62900f1206-profile-collector-cert\") pod \"catalog-operator-68c6474976-jwtm2\" (UID: \"e7391920-399a-4773-bc06-5b62900f1206\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.823086 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/436a7a59-c116-4383-b580-19d167e74eeb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-gg97h\" (UID: \"436a7a59-c116-4383-b580-19d167e74eeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.822944 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e47a3768-fc24-4a98-8ed4-2264127d71cd-proxy-tls\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.825040 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10a4e000-d2a9-455f-a7a7-ae4d90611c29-metrics-certs\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.825752 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-registry-certificates\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.826287 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e47a3768-fc24-4a98-8ed4-2264127d71cd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.826944 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81743785-297e-4cbd-b84e-de806ada8d8f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pjqpk\" (UID: \"81743785-297e-4cbd-b84e-de806ada8d8f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.829683 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-registry-tls\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.833657 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pzbwf"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.845921 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-bound-sa-token\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.866524 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px68c\" (UniqueName: \"kubernetes.io/projected/10a4e000-d2a9-455f-a7a7-ae4d90611c29-kube-api-access-px68c\") pod \"router-default-5444994796-ctnr7\" (UID: \"10a4e000-d2a9-455f-a7a7-ae4d90611c29\") " pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.892718 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.896535 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49g2l\" (UniqueName: \"kubernetes.io/projected/4b13000e-92ec-4de6-9e62-9232504476b4-kube-api-access-49g2l\") pod \"package-server-manager-789f6589d5-74xcz\" (UID: \"4b13000e-92ec-4de6-9e62-9232504476b4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.898750 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-gsv9x"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.901945 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g"] Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912124 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:34 crc kubenswrapper[4923]: E0224 02:56:34.912181 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.4121598 +0000 UTC m=+119.429230613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912349 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9dcee5b9-ae18-4145-9af2-c4ebf1e36294-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vvpzd\" (UID: \"9dcee5b9-ae18-4145-9af2-c4ebf1e36294\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912375 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30e1ab71-a068-4593-9dc7-f1f7731caeb9-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912392 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-csi-data-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912441 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a210d674-dc51-4a19-8172-a76c264f6f0d-certs\") pod \"machine-config-server-qgk6z\" (UID: 
\"a210d674-dc51-4a19-8172-a76c264f6f0d\") " pod="openshift-machine-config-operator/machine-config-server-qgk6z" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912467 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n2zj\" (UniqueName: \"kubernetes.io/projected/34d9b54c-a37d-407b-81c7-ff77a96b7dd8-kube-api-access-7n2zj\") pod \"control-plane-machine-set-operator-78cbb6b69f-rx89l\" (UID: \"34d9b54c-a37d-407b-81c7-ff77a96b7dd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912487 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c-metrics-tls\") pod \"dns-default-9jk4q\" (UID: \"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c\") " pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912503 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-webhook-cert\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912522 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ea1c9d1-2053-4519-a342-979703b00a41-serving-cert\") pod \"service-ca-operator-777779d784-kmr2g\" (UID: \"1ea1c9d1-2053-4519-a342-979703b00a41\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912543 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v979w\" (UniqueName: 
\"kubernetes.io/projected/ea015236-90a9-4feb-bc5a-800e9831fcd6-kube-api-access-v979w\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912572 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mz2\" (UniqueName: \"kubernetes.io/projected/b48a43d4-438d-40a9-8982-74ea7020664a-kube-api-access-x9mz2\") pod \"multus-admission-controller-857f4d67dd-hkwfz\" (UID: \"b48a43d4-438d-40a9-8982-74ea7020664a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912591 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b48a43d4-438d-40a9-8982-74ea7020664a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hkwfz\" (UID: \"b48a43d4-438d-40a9-8982-74ea7020664a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912618 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfr2k\" (UniqueName: \"kubernetes.io/projected/b66c0222-b5e0-4d1e-841e-507c8e61e482-kube-api-access-pfr2k\") pod \"marketplace-operator-79b997595-292s5\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912635 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93648d39-0b9e-4eda-9f09-a299571f3259-cert\") pod \"ingress-canary-v6g8h\" (UID: \"93648d39-0b9e-4eda-9f09-a299571f3259\") " pod="openshift-ingress-canary/ingress-canary-v6g8h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912656 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912681 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-plugins-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912698 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-apiservice-cert\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912715 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a210d674-dc51-4a19-8172-a76c264f6f0d-node-bootstrap-token\") pod \"machine-config-server-qgk6z\" (UID: \"a210d674-dc51-4a19-8172-a76c264f6f0d\") " pod="openshift-machine-config-operator/machine-config-server-qgk6z" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912735 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9dcee5b9-ae18-4145-9af2-c4ebf1e36294-proxy-tls\") pod \"machine-config-controller-84d6567774-vvpzd\" (UID: \"9dcee5b9-ae18-4145-9af2-c4ebf1e36294\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912754 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-292s5\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912770 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxxsd\" (UniqueName: \"kubernetes.io/projected/c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c-kube-api-access-pxxsd\") pod \"dns-default-9jk4q\" (UID: \"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c\") " pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912787 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ea1c9d1-2053-4519-a342-979703b00a41-config\") pod \"service-ca-operator-777779d784-kmr2g\" (UID: \"1ea1c9d1-2053-4519-a342-979703b00a41\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912805 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-mountpoint-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912829 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2hj4\" (UniqueName: \"kubernetes.io/projected/5417593c-9d13-4adf-8579-c7e9331d6aa8-kube-api-access-h2hj4\") pod 
\"service-ca-9c57cc56f-7qxgw\" (UID: \"5417593c-9d13-4adf-8579-c7e9331d6aa8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912848 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvd29\" (UniqueName: \"kubernetes.io/projected/30e1ab71-a068-4593-9dc7-f1f7731caeb9-kube-api-access-wvd29\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912864 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-292s5\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912879 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/30e1ab71-a068-4593-9dc7-f1f7731caeb9-ready\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912904 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-tmpfs\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912921 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvxs\" (UniqueName: 
\"kubernetes.io/projected/93648d39-0b9e-4eda-9f09-a299571f3259-kube-api-access-nvvxs\") pod \"ingress-canary-v6g8h\" (UID: \"93648d39-0b9e-4eda-9f09-a299571f3259\") " pod="openshift-ingress-canary/ingress-canary-v6g8h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912943 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjtsp\" (UniqueName: \"kubernetes.io/projected/a210d674-dc51-4a19-8172-a76c264f6f0d-kube-api-access-sjtsp\") pod \"machine-config-server-qgk6z\" (UID: \"a210d674-dc51-4a19-8172-a76c264f6f0d\") " pod="openshift-machine-config-operator/machine-config-server-qgk6z" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912961 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c-config-volume\") pod \"dns-default-9jk4q\" (UID: \"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c\") " pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.912979 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-registration-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.913002 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/30e1ab71-a068-4593-9dc7-f1f7731caeb9-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.913047 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/5417593c-9d13-4adf-8579-c7e9331d6aa8-signing-cabundle\") pod \"service-ca-9c57cc56f-7qxgw\" (UID: \"5417593c-9d13-4adf-8579-c7e9331d6aa8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.913077 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxj7v\" (UniqueName: \"kubernetes.io/projected/9dcee5b9-ae18-4145-9af2-c4ebf1e36294-kube-api-access-dxj7v\") pod \"machine-config-controller-84d6567774-vvpzd\" (UID: \"9dcee5b9-ae18-4145-9af2-c4ebf1e36294\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.913095 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5417593c-9d13-4adf-8579-c7e9331d6aa8-signing-key\") pod \"service-ca-9c57cc56f-7qxgw\" (UID: \"5417593c-9d13-4adf-8579-c7e9331d6aa8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.913123 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/34d9b54c-a37d-407b-81c7-ff77a96b7dd8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rx89l\" (UID: \"34d9b54c-a37d-407b-81c7-ff77a96b7dd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.913143 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98r85\" (UniqueName: \"kubernetes.io/projected/1ea1c9d1-2053-4519-a342-979703b00a41-kube-api-access-98r85\") pod \"service-ca-operator-777779d784-kmr2g\" (UID: \"1ea1c9d1-2053-4519-a342-979703b00a41\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" Feb 
24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.913160 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-socket-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.913178 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8827\" (UniqueName: \"kubernetes.io/projected/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-kube-api-access-g8827\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.913996 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9dcee5b9-ae18-4145-9af2-c4ebf1e36294-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vvpzd\" (UID: \"9dcee5b9-ae18-4145-9af2-c4ebf1e36294\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.914120 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30e1ab71-a068-4593-9dc7-f1f7731caeb9-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.914311 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-csi-data-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " 
pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.915921 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c-config-volume\") pod \"dns-default-9jk4q\" (UID: \"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c\") " pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.916075 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-mountpoint-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.917845 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/30e1ab71-a068-4593-9dc7-f1f7731caeb9-ready\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.918454 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-tmpfs\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.918998 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-registration-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 
02:56:34.919139 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-292s5\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.919710 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/30e1ab71-a068-4593-9dc7-f1f7731caeb9-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.919932 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-socket-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.920271 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5417593c-9d13-4adf-8579-c7e9331d6aa8-signing-cabundle\") pod \"service-ca-9c57cc56f-7qxgw\" (UID: \"5417593c-9d13-4adf-8579-c7e9331d6aa8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.922128 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/34d9b54c-a37d-407b-81c7-ff77a96b7dd8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rx89l\" (UID: \"34d9b54c-a37d-407b-81c7-ff77a96b7dd8\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.923241 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c-metrics-tls\") pod \"dns-default-9jk4q\" (UID: \"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c\") " pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:34 crc kubenswrapper[4923]: E0224 02:56:34.924954 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.424924966 +0000 UTC m=+119.441995989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.926123 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ea015236-90a9-4feb-bc5a-800e9831fcd6-plugins-dir\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.926731 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a210d674-dc51-4a19-8172-a76c264f6f0d-certs\") pod \"machine-config-server-qgk6z\" (UID: \"a210d674-dc51-4a19-8172-a76c264f6f0d\") " 
pod="openshift-machine-config-operator/machine-config-server-qgk6z" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.927273 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ea1c9d1-2053-4519-a342-979703b00a41-config\") pod \"service-ca-operator-777779d784-kmr2g\" (UID: \"1ea1c9d1-2053-4519-a342-979703b00a41\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.927868 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ea1c9d1-2053-4519-a342-979703b00a41-serving-cert\") pod \"service-ca-operator-777779d784-kmr2g\" (UID: \"1ea1c9d1-2053-4519-a342-979703b00a41\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.928123 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-apiservice-cert\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.930736 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b48a43d4-438d-40a9-8982-74ea7020664a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hkwfz\" (UID: \"b48a43d4-438d-40a9-8982-74ea7020664a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.931227 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-webhook-cert\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: 
\"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.932180 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93648d39-0b9e-4eda-9f09-a299571f3259-cert\") pod \"ingress-canary-v6g8h\" (UID: \"93648d39-0b9e-4eda-9f09-a299571f3259\") " pod="openshift-ingress-canary/ingress-canary-v6g8h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.932936 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5417593c-9d13-4adf-8579-c7e9331d6aa8-signing-key\") pod \"service-ca-9c57cc56f-7qxgw\" (UID: \"5417593c-9d13-4adf-8579-c7e9331d6aa8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.940004 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9dcee5b9-ae18-4145-9af2-c4ebf1e36294-proxy-tls\") pod \"machine-config-controller-84d6567774-vvpzd\" (UID: \"9dcee5b9-ae18-4145-9af2-c4ebf1e36294\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.942012 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81743785-297e-4cbd-b84e-de806ada8d8f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pjqpk\" (UID: \"81743785-297e-4cbd-b84e-de806ada8d8f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.947071 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a210d674-dc51-4a19-8172-a76c264f6f0d-node-bootstrap-token\") pod 
\"machine-config-server-qgk6z\" (UID: \"a210d674-dc51-4a19-8172-a76c264f6f0d\") " pod="openshift-machine-config-operator/machine-config-server-qgk6z" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.958278 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7k5\" (UniqueName: \"kubernetes.io/projected/436a7a59-c116-4383-b580-19d167e74eeb-kube-api-access-sc7k5\") pod \"olm-operator-6b444d44fb-gg97h\" (UID: \"436a7a59-c116-4383-b580-19d167e74eeb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.966082 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7jxs\" (UniqueName: \"kubernetes.io/projected/8e054d71-6ccc-4ab7-905a-5b1053bdbdc7-kube-api-access-d7jxs\") pod \"migrator-59844c95c7-x278g\" (UID: \"8e054d71-6ccc-4ab7-905a-5b1053bdbdc7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.970041 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-292s5\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.973592 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m549\" (UniqueName: \"kubernetes.io/projected/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-kube-api-access-7m549\") pod \"route-controller-manager-6576b87f9c-tw548\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:34 crc kubenswrapper[4923]: I0224 02:56:34.991974 4923 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-gn8s6\" (UniqueName: \"kubernetes.io/projected/e7391920-399a-4773-bc06-5b62900f1206-kube-api-access-gn8s6\") pod \"catalog-operator-68c6474976-jwtm2\" (UID: \"e7391920-399a-4773-bc06-5b62900f1206\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.013938 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9ck\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-kube-api-access-2c9ck\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.014992 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.015459 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.515433081 +0000 UTC m=+119.532503894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.019173 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jdnlf"] Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.037581 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfds7"] Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.058138 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"] Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.059168 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdw77\" (UniqueName: \"kubernetes.io/projected/e47a3768-fc24-4a98-8ed4-2264127d71cd-kube-api-access-gdw77\") pod \"machine-config-operator-74547568cd-2jm5s\" (UID: \"e47a3768-fc24-4a98-8ed4-2264127d71cd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.064657 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.070791 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k2q5j"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.072212 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.083969 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.092556 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.093449 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.106090 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8827\" (UniqueName: \"kubernetes.io/projected/b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8-kube-api-access-g8827\") pod \"packageserver-d55dfcdfc-6txs5\" (UID: \"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.107883 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxxsd\" (UniqueName: \"kubernetes.io/projected/c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c-kube-api-access-pxxsd\") pod \"dns-default-9jk4q\" (UID: \"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c\") " pod="openshift-dns/dns-default-9jk4q"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.117017 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5"
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.117481 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.61746651 +0000 UTC m=+119.634537323 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.125980 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.151585 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ctnr7"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.158792 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.161103 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.163677 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.170727 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.177289 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.193102 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.201902 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2hj4\" (UniqueName: \"kubernetes.io/projected/5417593c-9d13-4adf-8579-c7e9331d6aa8-kube-api-access-h2hj4\") pod \"service-ca-9c57cc56f-7qxgw\" (UID: \"5417593c-9d13-4adf-8579-c7e9331d6aa8\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.203279 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvd29\" (UniqueName: \"kubernetes.io/projected/30e1ab71-a068-4593-9dc7-f1f7731caeb9-kube-api-access-wvd29\") pod \"cni-sysctl-allowlist-ds-zswn9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.203978 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjtsp\" (UniqueName: \"kubernetes.io/projected/a210d674-dc51-4a19-8172-a76c264f6f0d-kube-api-access-sjtsp\") pod \"machine-config-server-qgk6z\" (UID: \"a210d674-dc51-4a19-8172-a76c264f6f0d\") " pod="openshift-machine-config-operator/machine-config-server-qgk6z"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.207495 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvxs\" (UniqueName: \"kubernetes.io/projected/93648d39-0b9e-4eda-9f09-a299571f3259-kube-api-access-nvvxs\") pod \"ingress-canary-v6g8h\" (UID: \"93648d39-0b9e-4eda-9f09-a299571f3259\") " pod="openshift-ingress-canary/ingress-canary-v6g8h"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.207513 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98r85\" (UniqueName: \"kubernetes.io/projected/1ea1c9d1-2053-4519-a342-979703b00a41-kube-api-access-98r85\") pod \"service-ca-operator-777779d784-kmr2g\" (UID: \"1ea1c9d1-2053-4519-a342-979703b00a41\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.207841 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxj7v\" (UniqueName: \"kubernetes.io/projected/9dcee5b9-ae18-4145-9af2-c4ebf1e36294-kube-api-access-dxj7v\") pod \"machine-config-controller-84d6567774-vvpzd\" (UID: \"9dcee5b9-ae18-4145-9af2-c4ebf1e36294\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.212057 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.218137 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.218533 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.718515272 +0000 UTC m=+119.735586085 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.222039 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.230917 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfr2k\" (UniqueName: \"kubernetes.io/projected/b66c0222-b5e0-4d1e-841e-507c8e61e482-kube-api-access-pfr2k\") pod \"marketplace-operator-79b997595-292s5\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " pod="openshift-marketplace/marketplace-operator-79b997595-292s5"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.235055 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.238187 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.247607 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v979w\" (UniqueName: \"kubernetes.io/projected/ea015236-90a9-4feb-bc5a-800e9831fcd6-kube-api-access-v979w\") pod \"csi-hostpathplugin-96lgm\" (UID: \"ea015236-90a9-4feb-bc5a-800e9831fcd6\") " pod="hostpath-provisioner/csi-hostpathplugin-96lgm"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.261190 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-96lgm"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.263420 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9jk4q"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.267558 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mz2\" (UniqueName: \"kubernetes.io/projected/b48a43d4-438d-40a9-8982-74ea7020664a-kube-api-access-x9mz2\") pod \"multus-admission-controller-857f4d67dd-hkwfz\" (UID: \"b48a43d4-438d-40a9-8982-74ea7020664a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.283719 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v6g8h"
Feb 24 02:56:35 crc kubenswrapper[4923]: W0224 02:56:35.287602 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60ecc3e7_33b2_4fa2_a53e_a7f9bc5afae6.slice/crio-d78c580305d688f52db9992de1eb87a5c7007c13e2493d536da3b9bb7ed06f60 WatchSource:0}: Error finding container d78c580305d688f52db9992de1eb87a5c7007c13e2493d536da3b9bb7ed06f60: Status 404 returned error can't find the container with id d78c580305d688f52db9992de1eb87a5c7007c13e2493d536da3b9bb7ed06f60
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.287633 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" event={"ID":"8ae95859-0a1a-4095-89a8-77ad9197a9e9","Type":"ContainerStarted","Data":"9dcc1d16f7d4f62f9941ca68ea6f589902016d9272e37dc73e15d92291084013"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.288187 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n2zj\" (UniqueName: \"kubernetes.io/projected/34d9b54c-a37d-407b-81c7-ff77a96b7dd8-kube-api-access-7n2zj\") pod \"control-plane-machine-set-operator-78cbb6b69f-rx89l\" (UID: \"34d9b54c-a37d-407b-81c7-ff77a96b7dd8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.290961 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qgk6z"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.291256 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gsv9x" event={"ID":"c5c02b8b-cae8-4e73-9e6f-34a8120f00c2","Type":"ContainerStarted","Data":"11a85fee3e6e7488f00b7cf01062881ecc061500d5316ac713560e4f679d6638"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.296432 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" event={"ID":"ae174d22-78c6-4699-9d9b-8ce566dc9f4c","Type":"ContainerStarted","Data":"d65de4632aa09151ac43837d7301c71d4100aa55525fca7b94d95cf0c492725b"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.296482 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" event={"ID":"ae174d22-78c6-4699-9d9b-8ce566dc9f4c","Type":"ContainerStarted","Data":"f2ebc7247d02588496af7b441f254e42ed888f95ab1537557d968a40a8014f72"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.297811 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" event={"ID":"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3","Type":"ContainerStarted","Data":"5156548ad67fd644c1606739f4d8eed57d03aec9fedce9fcf97a3d107ed018a6"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.305690 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" event={"ID":"348bc8fb-6fa1-40dc-82ca-99683b7e68ed","Type":"ContainerStarted","Data":"1e3d5b94e55a44d55f51ac0505c68fb521a4c0c8ebeef4c19aaf21daf2f9732a"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.310436 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ffdsb" event={"ID":"9605a72a-cc35-4904-a7f3-bbeff4972542","Type":"ContainerStarted","Data":"8edc698ab2f1c08ee8afc3d3c9a0765b51a88964e0e75865d96daed9a76283b1"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.310507 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ffdsb" event={"ID":"9605a72a-cc35-4904-a7f3-bbeff4972542","Type":"ContainerStarted","Data":"db3bcfc69b820f7c3214d1b4875c37e6a6a873fd3aa6a85e5f09e188de6b9166"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.311010 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ffdsb"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.312014 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" event={"ID":"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29","Type":"ContainerStarted","Data":"fa19150da061f2f9b1629fd59769ef9bd3e08c9b18a9bc2a445362335b970294"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.313046 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" event={"ID":"c1e09aba-671f-45f8-84e1-8a3813b39383","Type":"ContainerStarted","Data":"525fa8b19bebb7abc6dcf69395f36757f8e703dacebaf5718b8976074c703bdd"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.313901 4923 patch_prober.go:28] interesting pod/console-operator-58897d9998-ffdsb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.313947 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ffdsb" podUID="9605a72a-cc35-4904-a7f3-bbeff4972542" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.314005 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" event={"ID":"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6","Type":"ContainerStarted","Data":"f381f0cd9081fea8ef61692e8c09882b769b4896b413fe27433b61a6833c7fb6"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.315842 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" event={"ID":"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c","Type":"ContainerStarted","Data":"0e749991483ee2f2c32a98ce4d21c67b7e9400f5ee7c5fca19684fceeb3d1e73"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.315869 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" event={"ID":"9b5ccce4-6412-43a0-bcea-5f88c4b3b47c","Type":"ContainerStarted","Data":"0791aaae83070e80a02a619accef99e087a651aa20d374696bcc48b676b890c6"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.318053 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" event={"ID":"6c32d4cb-1966-42fd-ba29-1bda2bcced93","Type":"ContainerStarted","Data":"78058c88e0ff50ff19978a9bde0b23a62e4bc1ef478b764fc7e53559a6f5aa05"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.319655 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5"
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.319998 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.819982866 +0000 UTC m=+119.837053679 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.321116 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" event={"ID":"88764e4a-5a09-4223-b9b5-d576d2b36f41","Type":"ContainerStarted","Data":"3306a58a22c2c96591d1235e99362595e490f8338c4e1956a8a128a5d034a21a"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.325906 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" event={"ID":"52bd6986-2737-455e-bb5c-570aa29f4001","Type":"ContainerStarted","Data":"617fc6fc09888204dba99317713bbd675b86e758863814cb07544f2d0d236358"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.327849 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" event={"ID":"ba11e280-cb6f-45fb-8668-80650a1ad7bc","Type":"ContainerStarted","Data":"b6b589ff0f339b0d4e75ef5afd53510ffe64695e5cf96def787a6eff2e96fdd1"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.329115 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" event={"ID":"a99e06cc-b200-4073-a847-410f9799eb3a","Type":"ContainerStarted","Data":"e0f30223384b9fe3815bb5416883278d6902c2c962783c41cbcea5c777ec12c7"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.336681 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-w5k6j" event={"ID":"3f89d640-5e7f-473b-98e3-420780c10024","Type":"ContainerStarted","Data":"66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.336741 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-w5k6j" event={"ID":"3f89d640-5e7f-473b-98e3-420780c10024","Type":"ContainerStarted","Data":"97b420f6958520ae51fb9f650c82142cf47dc5d5114f60d98b53ae09964f4031"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.338239 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" event={"ID":"5401c124-d8da-4335-8cd9-b8afc71fc682","Type":"ContainerStarted","Data":"3e99a98859635cda7036a9fc114eb59de8b89b4e0ec729ebf9be1edadab16f38"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.338269 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" event={"ID":"5401c124-d8da-4335-8cd9-b8afc71fc682","Type":"ContainerStarted","Data":"012faa509dc867a67c3f253b551b6db5a396f5e0a27c32951819e4f0a0fe2929"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.339033 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" event={"ID":"bf70a70a-8a77-428b-97ca-2609ccc84a26","Type":"ContainerStarted","Data":"36d60c601deef4b5420ebf8d0c0713c70009a1edc3aff6db0470953e51b689d9"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.340087 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" event={"ID":"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6","Type":"ContainerStarted","Data":"5e263f246a01028cb333721222a0b6e0c992ab19e81c77c4db1a8a0a27c4bf71"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.342201 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" event={"ID":"58b2a3df-daae-45b6-8343-dedec3d3ecce","Type":"ContainerStarted","Data":"15e908fb1c4a2a2d18b9f09f7cb52dc73f7c5ca02388d91f15b758ab0931237b"}
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.405050 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q2s27"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.420317 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.421403 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:35.921383588 +0000 UTC m=+119.938454401 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.497768 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.505361 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.519567 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-292s5"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.521794 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5"
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.522149 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.022138153 +0000 UTC m=+120.039208966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.527848 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.568952 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.622840 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.622967 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.122951009 +0000 UTC m=+120.140021822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.623090 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5"
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.623345 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.123336189 +0000 UTC m=+120.140407002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.643752 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.723674 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.723848 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.223821907 +0000 UTC m=+120.240892720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.739222 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.739398 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp"
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.740123 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.240107556 +0000 UTC m=+120.257178369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.804575 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa5ec917-061e-4f9c-8930-994239908f27-metrics-certs\") pod \"network-metrics-daemon-pl8mp\" (UID: \"fa5ec917-061e-4f9c-8930-994239908f27\") " pod="openshift-multus/network-metrics-daemon-pl8mp"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.841655 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.841981 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.341929669 +0000 UTC m=+120.359000482 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.842444 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5"
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.842777 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.342763311 +0000 UTC m=+120.359834124 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.853169 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.853221 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk"]
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.868278 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pl8mp"
Feb 24 02:56:35 crc kubenswrapper[4923]: I0224 02:56:35.944260 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 02:56:35 crc kubenswrapper[4923]: E0224 02:56:35.944613 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.444599935 +0000 UTC m=+120.461670748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.046446 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5"
Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.046791 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.546776017 +0000 UTC m=+120.563846830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.148156 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.148238 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.64821998 +0000 UTC m=+120.665290783 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.149581 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.149840 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.649832623 +0000 UTC m=+120.666903436 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.254709 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.255570 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.755545328 +0000 UTC m=+120.772616151 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.273843 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.27382507 podStartE2EDuration="2.27382507s" podCreationTimestamp="2026-02-24 02:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:36.271775966 +0000 UTC m=+120.288846779" watchObservedRunningTime="2026-02-24 02:56:36.27382507 +0000 UTC m=+120.290895883" Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.357999 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.373074 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" event={"ID":"81743785-297e-4cbd-b84e-de806ada8d8f","Type":"ContainerStarted","Data":"b3d783847a5501f18f3c63a654cd477e98bc2d53c48bb31caf03bbc882473408"} Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.384505 4923 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.884489136 +0000 UTC m=+120.901559949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.395345 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" event={"ID":"8ae95859-0a1a-4095-89a8-77ad9197a9e9","Type":"ContainerStarted","Data":"3552d803403164560980117fedff8d959941413ec36308a4d006169153f1396d"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.400235 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7qxgw"] Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.411048 4923 generic.go:334] "Generic (PLEG): container finished" podID="ba11e280-cb6f-45fb-8668-80650a1ad7bc" containerID="1e183eb5314f30f8c6da3e2eb2b3408042aefd4d6b0392892f48be4dec484088" exitCode=0 Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.411130 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" event={"ID":"ba11e280-cb6f-45fb-8668-80650a1ad7bc","Type":"ContainerDied","Data":"1e183eb5314f30f8c6da3e2eb2b3408042aefd4d6b0392892f48be4dec484088"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.412666 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-9jk4q"] Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.427021 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" event={"ID":"a99e06cc-b200-4073-a847-410f9799eb3a","Type":"ContainerStarted","Data":"a76c0488461ed909bb928cc0aadfa91454b9c0426b9fb3c1fef88d255d4142a4"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.431247 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.449480 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b743a6362f655b21457e1b2fb383dd1ef637e82c5ec68b7c2768e5d371d2be7f"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.449846 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.458974 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.461034 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:36.961016892 +0000 UTC m=+120.978087705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.463137 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" event={"ID":"e59578f2-07d5-4eb9-8b58-22a2b4f73a3b","Type":"ContainerStarted","Data":"3bc690e2678dd73bf0ad408c05983238c4f6fff6449b048783a9c6f612692e37"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.465075 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" event={"ID":"e7391920-399a-4773-bc06-5b62900f1206","Type":"ContainerStarted","Data":"16e8ce76b13641a53364cec453e55653b9002f76c123175c64491f9f611f20ea"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.468190 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" event={"ID":"88764e4a-5a09-4223-b9b5-d576d2b36f41","Type":"ContainerStarted","Data":"d7d97df6a6cd24b24dcaf31712caaf47a68aa452c3068210edd6ec8e23dca9e5"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.471804 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" event={"ID":"c1e09aba-671f-45f8-84e1-8a3813b39383","Type":"ContainerStarted","Data":"bd8c7489dad86369d73a908827a414a89202419efe30a42375a5b28fba78c462"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.474423 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gsv9x" 
event={"ID":"c5c02b8b-cae8-4e73-9e6f-34a8120f00c2","Type":"ContainerStarted","Data":"dfbda8413af58aa7c1ab27f4335128da179695a3415360f906063d5cdf4f4099"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.474946 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gsv9x" Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.477883 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qgk6z" event={"ID":"a210d674-dc51-4a19-8172-a76c264f6f0d","Type":"ContainerStarted","Data":"f93f248f1097f0af3d3e4075dfa320746ba52e1d8a8af554e42cc413759aaf36"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.483703 4923 patch_prober.go:28] interesting pod/downloads-7954f5f757-gsv9x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.483748 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gsv9x" podUID="c5c02b8b-cae8-4e73-9e6f-34a8120f00c2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 24 02:56:36 crc kubenswrapper[4923]: W0224 02:56:36.510785 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc62f6b93_9a26_4b21_a8b9_d86b7ab4e75c.slice/crio-42224d5424f1808e1d1fb142294775f511e4d3eeab65c1d3ef835a10aaa4392b WatchSource:0}: Error finding container 42224d5424f1808e1d1fb142294775f511e4d3eeab65c1d3ef835a10aaa4392b: Status 404 returned error can't find the container with id 42224d5424f1808e1d1fb142294775f511e4d3eeab65c1d3ef835a10aaa4392b Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.525609 4923 generic.go:334] "Generic 
(PLEG): container finished" podID="bf70a70a-8a77-428b-97ca-2609ccc84a26" containerID="38eb9f2e71bda1cd5fccc83638c674664263135fc28932847e1312852a7e3542" exitCode=0 Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.525699 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" event={"ID":"bf70a70a-8a77-428b-97ca-2609ccc84a26","Type":"ContainerDied","Data":"38eb9f2e71bda1cd5fccc83638c674664263135fc28932847e1312852a7e3542"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.530629 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" event={"ID":"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3","Type":"ContainerStarted","Data":"722b2831c7bf46342a36cc1f84d64be8d42bebbeb7af8ed6923de47bad87a84b"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.531411 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.534215 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" event={"ID":"52bd6986-2737-455e-bb5c-570aa29f4001","Type":"ContainerStarted","Data":"d3825e72b2abcd561484d42a93fa8257b5e2d2b1e2394370c847bf6428dc9b0a"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.540261 4923 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-k2q5j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.540313 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" podUID="0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.543504 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" event={"ID":"5b6bae49-7ab6-4aab-bed2-8e6507bc798a","Type":"ContainerStarted","Data":"a6158bb003b1aa500d8f67f11e5df1afe366bf26337e8b752c84e047fa6ea7cc"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.557325 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" event={"ID":"30e1ab71-a068-4593-9dc7-f1f7731caeb9","Type":"ContainerStarted","Data":"84b23c7581da947b6bdb888347cef89e61e753e150cf52164db9e3aefcc56858"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.565530 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.565897 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.065882325 +0000 UTC m=+121.082953138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.584220 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" event={"ID":"99dcd7e1-2d1d-4d12-a279-02c56e1d96c6","Type":"ContainerStarted","Data":"22a2ad3fe209cd5cc8a65e39c1b84221a93d168ade3cb96ffb1da4ae5252e682"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.587262 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" event={"ID":"348bc8fb-6fa1-40dc-82ca-99683b7e68ed","Type":"ContainerStarted","Data":"9348a92342e86d4ca970f601350716e1770e6c391d171aebe295dcda9cf6ba4a"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.595959 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" event={"ID":"5401c124-d8da-4335-8cd9-b8afc71fc682","Type":"ContainerStarted","Data":"9db6d31e165dd958fa2f2f5370ccb1c8e5b6e354a9ddc5a4c44eb7c9bff8fbc1"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.599762 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" event={"ID":"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6","Type":"ContainerStarted","Data":"34e5091771224c073b66c76363ca7657af0e9429006db2f6aa07cc5082c69b80"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.599800 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" event={"ID":"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6","Type":"ContainerStarted","Data":"d78c580305d688f52db9992de1eb87a5c7007c13e2493d536da3b9bb7ed06f60"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.606344 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" event={"ID":"ae174d22-78c6-4699-9d9b-8ce566dc9f4c","Type":"ContainerStarted","Data":"11a00be60fc448744f1cfe960e525a8af3ce1a7b5fb74254fe2ba0cc0867f012"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.607700 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" event={"ID":"1ea1c9d1-2053-4519-a342-979703b00a41","Type":"ContainerStarted","Data":"c5e635a766e70985d8daa08a823bbe0cf82a764a9b538d0716a4c4b029b635c6"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.608691 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ctnr7" event={"ID":"10a4e000-d2a9-455f-a7a7-ae4d90611c29","Type":"ContainerStarted","Data":"90803a366eb953356d7a6d9e4e2a9499ed48e0a254aa639ecb17d34795b9792f"} Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.667009 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.667107 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 02:56:37.167093492 +0000 UTC m=+121.184164305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.669784 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.675888 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.175856423 +0000 UTC m=+121.192927316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.771075 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.774260 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.274228365 +0000 UTC m=+121.291299178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.782105 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.785290 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.285270946 +0000 UTC m=+121.302341759 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.799864 4923 csr.go:261] certificate signing request csr-nx6j5 is approved, waiting to be issued Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.815121 4923 csr.go:257] certificate signing request csr-nx6j5 is issued Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.821900 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ffdsb" Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.885779 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.886290 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.386271578 +0000 UTC m=+121.403342391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.890049 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g"] Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.915514 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5"] Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.940612 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v6g8h"] Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.955031 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-82dxq" podStartSLOduration=44.955006669 podStartE2EDuration="44.955006669s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:36.947807529 +0000 UTC m=+120.964878332" watchObservedRunningTime="2026-02-24 02:56:36.955006669 +0000 UTC m=+120.972077482" Feb 24 02:56:36 crc kubenswrapper[4923]: W0224 02:56:36.974779 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2e3c432_2e8f_4e84_9ae3_c08b02dea7d8.slice/crio-753b5950268974b6ec07bfc96c06fe03ae6dd1ecb987e7cee882867d1c75d953 WatchSource:0}: Error finding container 
753b5950268974b6ec07bfc96c06fe03ae6dd1ecb987e7cee882867d1c75d953: Status 404 returned error can't find the container with id 753b5950268974b6ec07bfc96c06fe03ae6dd1ecb987e7cee882867d1c75d953 Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.978411 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz"] Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.987450 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:36 crc kubenswrapper[4923]: E0224 02:56:36.987885 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.487869915 +0000 UTC m=+121.504940728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:36 crc kubenswrapper[4923]: I0224 02:56:36.995904 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jdnlf" podStartSLOduration=45.995888556 podStartE2EDuration="45.995888556s" podCreationTimestamp="2026-02-24 02:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:36.995325441 +0000 UTC m=+121.012396264" watchObservedRunningTime="2026-02-24 02:56:36.995888556 +0000 UTC m=+121.012959369" Feb 24 02:56:37 crc kubenswrapper[4923]: W0224 02:56:37.050075 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b13000e_92ec_4de6_9e62_9232504476b4.slice/crio-fb8a9c96398925f6497750d9d7cb21b907d4382319f8fa8fffe871ca59139f6b WatchSource:0}: Error finding container fb8a9c96398925f6497750d9d7cb21b907d4382319f8fa8fffe871ca59139f6b: Status 404 returned error can't find the container with id fb8a9c96398925f6497750d9d7cb21b907d4382319f8fa8fffe871ca59139f6b Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.051400 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pzbwf" podStartSLOduration=45.051366308 podStartE2EDuration="45.051366308s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.050856104 +0000 UTC m=+121.067926917" watchObservedRunningTime="2026-02-24 02:56:37.051366308 +0000 UTC m=+121.068437121" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.092867 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.093651 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.593636542 +0000 UTC m=+121.610707355 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.104427 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-w6rtx" podStartSLOduration=46.104412306 podStartE2EDuration="46.104412306s" podCreationTimestamp="2026-02-24 02:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.082359025 +0000 UTC m=+121.099429838" watchObservedRunningTime="2026-02-24 02:56:37.104412306 +0000 UTC m=+121.121483119" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.105154 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.125306 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.143440 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.143423954 podStartE2EDuration="14.143423954s" podCreationTimestamp="2026-02-24 02:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.142217832 +0000 UTC m=+121.159288645" watchObservedRunningTime="2026-02-24 02:56:37.143423954 +0000 UTC 
m=+121.160494767" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.143765 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-96lgm"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.181657 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-292s5"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.188952 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8rxcd" podStartSLOduration=45.188934103 podStartE2EDuration="45.188934103s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.186639712 +0000 UTC m=+121.203710525" watchObservedRunningTime="2026-02-24 02:56:37.188934103 +0000 UTC m=+121.206004916" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.194957 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.195324 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.695310571 +0000 UTC m=+121.712381384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.195630 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.211034 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pl8mp"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.249071 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ffdsb" podStartSLOduration=45.249053747 podStartE2EDuration="45.249053747s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.224001877 +0000 UTC m=+121.241072690" watchObservedRunningTime="2026-02-24 02:56:37.249053747 +0000 UTC m=+121.266124550" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.296043 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.305922 4923 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.796494727 +0000 UTC m=+121.813565540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.322655 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bgtwq" podStartSLOduration=45.322636616 podStartE2EDuration="45.322636616s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.31636304 +0000 UTC m=+121.333433853" watchObservedRunningTime="2026-02-24 02:56:37.322636616 +0000 UTC m=+121.339707419" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.323866 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" podStartSLOduration=45.323858018 podStartE2EDuration="45.323858018s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.248222125 +0000 UTC m=+121.265292938" watchObservedRunningTime="2026-02-24 02:56:37.323858018 +0000 UTC m=+121.340928831" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.330952 4923 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.351877 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hkwfz"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.401689 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.402082 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:37.902066949 +0000 UTC m=+121.919137762 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.433184 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-w5k6j" podStartSLOduration=45.433170598 podStartE2EDuration="45.433170598s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.431528555 +0000 UTC m=+121.448599358" watchObservedRunningTime="2026-02-24 02:56:37.433170598 +0000 UTC m=+121.450241411" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.467076 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" podStartSLOduration=45.467059311 podStartE2EDuration="45.467059311s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.466188508 +0000 UTC m=+121.483259321" watchObservedRunningTime="2026-02-24 02:56:37.467059311 +0000 UTC m=+121.484130114" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.494098 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gsv9x" podStartSLOduration=45.494079433 podStartE2EDuration="45.494079433s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.493849537 +0000 UTC m=+121.510920350" watchObservedRunningTime="2026-02-24 02:56:37.494079433 +0000 UTC m=+121.511150246" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.503838 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.504179 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.004165719 +0000 UTC m=+122.021236532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.562205 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cldx7" podStartSLOduration=46.562183388 podStartE2EDuration="46.562183388s" podCreationTimestamp="2026-02-24 02:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.557512535 +0000 UTC m=+121.574583348" watchObservedRunningTime="2026-02-24 02:56:37.562183388 +0000 UTC m=+121.579254201" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.587770 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gwpbz" podStartSLOduration=45.587751132 podStartE2EDuration="45.587751132s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.586386566 +0000 UTC m=+121.603457379" watchObservedRunningTime="2026-02-24 02:56:37.587751132 +0000 UTC m=+121.604821945" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.620007 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.620276 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.120264828 +0000 UTC m=+122.137335641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.663834 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-trp2g" podStartSLOduration=45.663816096 podStartE2EDuration="45.663816096s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:37.662028449 +0000 UTC m=+121.679099302" watchObservedRunningTime="2026-02-24 02:56:37.663816096 +0000 UTC m=+121.680886909" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.679026 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" event={"ID":"436a7a59-c116-4383-b580-19d167e74eeb","Type":"ContainerStarted","Data":"47eaf930c303e15d0d9ddb057bf1bba3ba09c92dead097a2220e4b502de412fb"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 
02:56:37.703676 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" event={"ID":"b66c0222-b5e0-4d1e-841e-507c8e61e482","Type":"ContainerStarted","Data":"6d99c7cb2f5a28862baa88829363b97fde2cabc13a9ca38c65e6cf7079139f86"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.721844 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.722150 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.222137103 +0000 UTC m=+122.239207916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.793614 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.793657 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" event={"ID":"30e1ab71-a068-4593-9dc7-f1f7731caeb9","Type":"ContainerStarted","Data":"1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.793675 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" event={"ID":"e59578f2-07d5-4eb9-8b58-22a2b4f73a3b","Type":"ContainerStarted","Data":"d9a570391008dd52d31c7a5562bd658636a5ce304917cab6eeecc28247c0004c"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.803690 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v6g8h" event={"ID":"93648d39-0b9e-4eda-9f09-a299571f3259","Type":"ContainerStarted","Data":"b55919c79b7529951282203d6eebcc683e4397c3211c16f1aa2c5c7562aff124"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.803749 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v6g8h" event={"ID":"93648d39-0b9e-4eda-9f09-a299571f3259","Type":"ContainerStarted","Data":"35e670a16ff48e6e3cdb9289e510f277f5eb5344a7ac6e2a805ecd8e78752d18"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.813679 4923 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96lgm" event={"ID":"ea015236-90a9-4feb-bc5a-800e9831fcd6","Type":"ContainerStarted","Data":"29cf59ec409d97a9c8fab7c7767dd7056a7905173968ce95172402868fb12d80"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.822888 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.823250 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.323237987 +0000 UTC m=+122.340308800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.824811 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k2q5j"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.826136 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" event={"ID":"bf70a70a-8a77-428b-97ca-2609ccc84a26","Type":"ContainerStarted","Data":"d07900714dd48755f154844970710c93a49a648227598f27fdf2f8f80499dad0"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.836198 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 02:51:36 +0000 UTC, rotation deadline is 2026-11-17 20:58:11.494978373 +0000 UTC Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.836249 4923 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6402h1m33.658731134s for next certificate rotation Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.842938 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ctnr7" event={"ID":"10a4e000-d2a9-455f-a7a7-ae4d90611c29","Type":"ContainerStarted","Data":"f0a00db466c1de56ab40513f6476637dedb40220d03e9f0efc00ee4dd13af35a"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.846661 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" 
event={"ID":"b48a43d4-438d-40a9-8982-74ea7020664a","Type":"ContainerStarted","Data":"aa37e724e368206c86652ca0cb61929cb8ea9bcd69b0f8ab5ee62b7242dcfa0f"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.872985 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548"] Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.889667 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" event={"ID":"81743785-297e-4cbd-b84e-de806ada8d8f","Type":"ContainerStarted","Data":"5663a410fe586a6cc46d1990af0fed66dadc66eb2910f1b05b6ebe5179e608b1"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.894961 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pl8mp" event={"ID":"fa5ec917-061e-4f9c-8930-994239908f27","Type":"ContainerStarted","Data":"d624b283cb24e0248d2d46cc0b601f1bbbb737a16f809ab0240bf93b83b1fc35"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.924087 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.924387 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.424359321 +0000 UTC m=+122.441430134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.924627 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:37 crc kubenswrapper[4923]: E0224 02:56:37.925420 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.425408519 +0000 UTC m=+122.442479332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.962568 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g" event={"ID":"8e054d71-6ccc-4ab7-905a-5b1053bdbdc7","Type":"ContainerStarted","Data":"51c7599ef30ae6d8f73fbd017f2604e09fb6a408fcc139e7c86f5475d1b288ee"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.962607 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g" event={"ID":"8e054d71-6ccc-4ab7-905a-5b1053bdbdc7","Type":"ContainerStarted","Data":"3427e04032667bb6ad5a13cab360debb4764a574d496cfeecac71f02adc9d383"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.979690 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" event={"ID":"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29","Type":"ContainerStarted","Data":"a84cb03f106753b6f0af813959423454f8e1546e9d2293bb47485348ae4a345e"} Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.980524 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.996340 4923 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-lfds7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection 
refused" start-of-body= Feb 24 02:56:37 crc kubenswrapper[4923]: I0224 02:56:37.996400 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.001918 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" event={"ID":"6c32d4cb-1966-42fd-ba29-1bda2bcced93","Type":"ContainerStarted","Data":"6cbe5761a5cc9cb74fddfd6379fdc67f4c2e736497a1371fab32dd54aba51263"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.026886 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:38 crc kubenswrapper[4923]: E0224 02:56:38.027702 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.527680824 +0000 UTC m=+122.544751637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.045410 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l" event={"ID":"34d9b54c-a37d-407b-81c7-ff77a96b7dd8","Type":"ContainerStarted","Data":"b5a08bc8bbcf2d45c226484408009e042993aa1b94a8d8133283203d874925dc"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.063403 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" event={"ID":"4b13000e-92ec-4de6-9e62-9232504476b4","Type":"ContainerStarted","Data":"fb8a9c96398925f6497750d9d7cb21b907d4382319f8fa8fffe871ca59139f6b"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.085193 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qgk6z" event={"ID":"a210d674-dc51-4a19-8172-a76c264f6f0d","Type":"ContainerStarted","Data":"eb2f1018061abdfb2c9f135550b03eae3183136655a9a5fe171e92d72abb5a2c"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.132450 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:38 crc kubenswrapper[4923]: E0224 
02:56:38.132772 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.632759522 +0000 UTC m=+122.649830335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.147700 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" event={"ID":"1ea1c9d1-2053-4519-a342-979703b00a41","Type":"ContainerStarted","Data":"bdc9edf71ed2d6ceb4cd9f5bd625910b3437aefc5f7d8d585f7602fae7728398"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.154924 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ctnr7" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.158634 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" event={"ID":"60ecc3e7-33b2-4fa2-a53e-a7f9bc5afae6","Type":"ContainerStarted","Data":"141b95824e644ff3f7efe25e19693d9c8e6a3500cc425bc65f4a678d7a9e4ee6"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.161241 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 24 02:56:38 crc 
kubenswrapper[4923]: I0224 02:56:38.161309 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.194143 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" event={"ID":"5b6bae49-7ab6-4aab-bed2-8e6507bc798a","Type":"ContainerStarted","Data":"e19942ab5d5b6fad40c33c64990a9f107902231ad7f089b877604ce2a1a4b392"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.194772 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.231129 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" event={"ID":"5417593c-9d13-4adf-8579-c7e9331d6aa8","Type":"ContainerStarted","Data":"7f8b284ed394313b6272432cec1aff855b793ee6f008234793fb746e939adf8b"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.231186 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" event={"ID":"5417593c-9d13-4adf-8579-c7e9331d6aa8","Type":"ContainerStarted","Data":"842694544e9dc5e09069a426d8715be11cb05f2ff5820aa28d2fa0b5bfc3b6b1"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.311760 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:38 crc 
kubenswrapper[4923]: E0224 02:56:38.312151 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.812137048 +0000 UTC m=+122.829207861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.318590 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" event={"ID":"9dcee5b9-ae18-4145-9af2-c4ebf1e36294","Type":"ContainerStarted","Data":"cbb09614ad81e0a103a8d175d55cbd3bcdd70b91b7feac3b43fcaef490d938e4"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.371706 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" event={"ID":"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8","Type":"ContainerStarted","Data":"6e1fc04310a24eafb112b5f650d9848ccda7556a1837993013e55c1bd452b35a"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.371752 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" event={"ID":"b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8","Type":"ContainerStarted","Data":"753b5950268974b6ec07bfc96c06fe03ae6dd1ecb987e7cee882867d1c75d953"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.372019 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.386563 4923 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6txs5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.386859 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" podUID="b2e3c432-2e8f-4e84-9ae3-c08b02dea7d8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.388316 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" event={"ID":"e7391920-399a-4773-bc06-5b62900f1206","Type":"ContainerStarted","Data":"5a490a8922ab848ddadb171120adbcd56dc979e52450a034ce48996efd2940fe"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.388824 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.416648 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" event={"ID":"d69a94d1-60bd-4bd5-90ca-7d7cd50438b6","Type":"ContainerStarted","Data":"a016e4d2a94c8cdb3119fc5e62df7f04f0059b6462f371d68eafda0adff703ef"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.430918 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:38 crc kubenswrapper[4923]: E0224 02:56:38.433382 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:38.933366893 +0000 UTC m=+122.950437706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.468083 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9jk4q" event={"ID":"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c","Type":"ContainerStarted","Data":"6161a11e7df2a2a748ba83a9d14a465e755d2942304bea3dead85540e926ae57"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.468198 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9jk4q" event={"ID":"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c","Type":"ContainerStarted","Data":"42224d5424f1808e1d1fb142294775f511e4d3eeab65c1d3ef835a10aaa4392b"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.469951 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.527206 4923 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" event={"ID":"ba11e280-cb6f-45fb-8668-80650a1ad7bc","Type":"ContainerStarted","Data":"254da50f28da8f9099c1c8302b6f4b55178cc192118f75037c46b1ab057e7f78"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.528398 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.535361 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:38 crc kubenswrapper[4923]: E0224 02:56:38.536659 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.036643934 +0000 UTC m=+123.053714747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.541583 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" event={"ID":"e47a3768-fc24-4a98-8ed4-2264127d71cd","Type":"ContainerStarted","Data":"861e7496f2227fa0eaaed579f17394169c8c15dcad0e42b2d9c44e55becacdcf"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.592418 4923 generic.go:334] "Generic (PLEG): container finished" podID="58b2a3df-daae-45b6-8343-dedec3d3ecce" containerID="3681ac346a7746de7bb0aeaafd49d2a5d15649817e1b80aff2a8957b364f962d" exitCode=0 Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.592656 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" event={"ID":"58b2a3df-daae-45b6-8343-dedec3d3ecce","Type":"ContainerDied","Data":"3681ac346a7746de7bb0aeaafd49d2a5d15649817e1b80aff2a8957b364f962d"} Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.593486 4923 patch_prober.go:28] interesting pod/downloads-7954f5f757-gsv9x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.593526 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gsv9x" podUID="c5c02b8b-cae8-4e73-9e6f-34a8120f00c2" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.619115 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" podStartSLOduration=46.619098887 podStartE2EDuration="46.619098887s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:38.563681506 +0000 UTC m=+122.580752319" watchObservedRunningTime="2026-02-24 02:56:38.619098887 +0000 UTC m=+122.636169700" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.632834 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.638030 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:38 crc kubenswrapper[4923]: E0224 02:56:38.638657 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.138640781 +0000 UTC m=+123.155711594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.686921 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" podStartSLOduration=6.686902943 podStartE2EDuration="6.686902943s" podCreationTimestamp="2026-02-24 02:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:38.629922272 +0000 UTC m=+122.646993075" watchObservedRunningTime="2026-02-24 02:56:38.686902943 +0000 UTC m=+122.703973756" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.687957 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pjqpk" podStartSLOduration=46.687953561 podStartE2EDuration="46.687953561s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:38.686116452 +0000 UTC m=+122.703187265" watchObservedRunningTime="2026-02-24 02:56:38.687953561 +0000 UTC m=+122.705024374" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.711979 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-866rc" podStartSLOduration=46.711963523 podStartE2EDuration="46.711963523s" podCreationTimestamp="2026-02-24 02:55:52 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:38.710483684 +0000 UTC m=+122.727554487" watchObservedRunningTime="2026-02-24 02:56:38.711963523 +0000 UTC m=+122.729034336" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.739409 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:38 crc kubenswrapper[4923]: E0224 02:56:38.741476 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.24144058 +0000 UTC m=+123.258511393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.777264 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7qxgw" podStartSLOduration=46.777246014 podStartE2EDuration="46.777246014s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:38.773764672 +0000 UTC m=+122.790835485" watchObservedRunningTime="2026-02-24 02:56:38.777246014 +0000 UTC m=+122.794316827" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.843114 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:38 crc kubenswrapper[4923]: E0224 02:56:38.843438 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.343427078 +0000 UTC m=+123.360497891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.913615 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" podStartSLOduration=47.913597416 podStartE2EDuration="47.913597416s" podCreationTimestamp="2026-02-24 02:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:38.857911219 +0000 UTC m=+122.874982032" watchObservedRunningTime="2026-02-24 02:56:38.913597416 +0000 UTC m=+122.930668229" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.922322 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.943545 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:38 crc kubenswrapper[4923]: E0224 02:56:38.943822 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 02:56:39.443806703 +0000 UTC m=+123.460877516 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.989556 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jwtm2" podStartSLOduration=46.989541608 podStartE2EDuration="46.989541608s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:38.988973113 +0000 UTC m=+123.006043916" watchObservedRunningTime="2026-02-24 02:56:38.989541608 +0000 UTC m=+123.006612421" Feb 24 02:56:38 crc kubenswrapper[4923]: I0224 02:56:38.998966 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ctnr7" podStartSLOduration=46.998947895 podStartE2EDuration="46.998947895s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:38.915607429 +0000 UTC m=+122.932678242" watchObservedRunningTime="2026-02-24 02:56:38.998947895 +0000 UTC m=+123.016018708" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.028369 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kmr2g" podStartSLOduration=47.02835333 
podStartE2EDuration="47.02835333s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.027677423 +0000 UTC m=+123.044748236" watchObservedRunningTime="2026-02-24 02:56:39.02835333 +0000 UTC m=+123.045424143" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.046015 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.046430 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.546416106 +0000 UTC m=+123.563486919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.082259 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qgk6z" podStartSLOduration=7.0822409 podStartE2EDuration="7.0822409s" podCreationTimestamp="2026-02-24 02:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.079841347 +0000 UTC m=+123.096912160" watchObservedRunningTime="2026-02-24 02:56:39.0822409 +0000 UTC m=+123.099311713" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.108842 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xxmsz" podStartSLOduration=47.108826261 podStartE2EDuration="47.108826261s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.107029773 +0000 UTC m=+123.124100586" watchObservedRunningTime="2026-02-24 02:56:39.108826261 +0000 UTC m=+123.125897064" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.147178 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" podStartSLOduration=47.147162271 podStartE2EDuration="47.147162271s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.143444403 +0000 UTC m=+123.160515216" watchObservedRunningTime="2026-02-24 02:56:39.147162271 +0000 UTC m=+123.164233084" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.157769 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.158106 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.658092199 +0000 UTC m=+123.675163012 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.159452 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 02:56:39 crc kubenswrapper[4923]: [-]has-synced failed: reason withheld Feb 24 02:56:39 crc kubenswrapper[4923]: [+]process-running ok Feb 24 02:56:39 crc kubenswrapper[4923]: healthz check failed Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.159495 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.196575 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gvr8j" podStartSLOduration=47.196558313 podStartE2EDuration="47.196558313s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.196032539 +0000 UTC m=+123.213103352" watchObservedRunningTime="2026-02-24 02:56:39.196558313 +0000 UTC m=+123.213629126" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.240067 4923 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-v6g8h" podStartSLOduration=7.240047518 podStartE2EDuration="7.240047518s" podCreationTimestamp="2026-02-24 02:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.239595017 +0000 UTC m=+123.256665840" watchObservedRunningTime="2026-02-24 02:56:39.240047518 +0000 UTC m=+123.257118331" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.259184 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.259522 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.759507641 +0000 UTC m=+123.776578454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.360020 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.360355 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.860336638 +0000 UTC m=+123.877407451 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.360674 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.361040 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.861028956 +0000 UTC m=+123.878099769 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.467829 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.468730 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:39.968705324 +0000 UTC m=+123.985776137 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.569398 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.569853 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.069835058 +0000 UTC m=+124.086905871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.607900 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pl8mp" event={"ID":"fa5ec917-061e-4f9c-8930-994239908f27","Type":"ContainerStarted","Data":"bbdf7f86b497e76f6027c714c7997b657b7e29393af2b9ff3627e22736b51682"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.607948 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pl8mp" event={"ID":"fa5ec917-061e-4f9c-8930-994239908f27","Type":"ContainerStarted","Data":"a0efafe71923009166c3cb7ba9b90f5f7e8423c41d0cfdb6b22b4674eff4291f"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.620740 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g" event={"ID":"8e054d71-6ccc-4ab7-905a-5b1053bdbdc7","Type":"ContainerStarted","Data":"c6ae9c3d71459018fa9b5e9aee0aafc095613d1e1bc441145964eaaa0a7f5764"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.630985 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" podStartSLOduration=47.630965239 podStartE2EDuration="47.630965239s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.408339953 +0000 UTC m=+123.425410766" watchObservedRunningTime="2026-02-24 
02:56:39.630965239 +0000 UTC m=+123.648036052" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.632791 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pl8mp" podStartSLOduration=47.632786307 podStartE2EDuration="47.632786307s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.630417845 +0000 UTC m=+123.647488658" watchObservedRunningTime="2026-02-24 02:56:39.632786307 +0000 UTC m=+123.649857120" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.637488 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" event={"ID":"4b13000e-92ec-4de6-9e62-9232504476b4","Type":"ContainerStarted","Data":"4e175cd730f8bbb825954e6504f0eb798e497d365cb61e0ea2214815c64147e9"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.637538 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" event={"ID":"4b13000e-92ec-4de6-9e62-9232504476b4","Type":"ContainerStarted","Data":"c6b8ee86331c551d00772d17e4d5a71af23d8e9e1255fa62030cba0df49191f3"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.637634 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.651349 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" event={"ID":"58b2a3df-daae-45b6-8343-dedec3d3ecce","Type":"ContainerStarted","Data":"06d1536ae73a5132545a12832b0c1705f1549cfe144aa1a48814283babdd89af"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.670244 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.670413 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.170389638 +0000 UTC m=+124.187460451 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.670544 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.670841 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.17083008 +0000 UTC m=+124.187900893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.675223 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" event={"ID":"bf70a70a-8a77-428b-97ca-2609ccc84a26","Type":"ContainerStarted","Data":"52fe026c96dbcb600375c5beb8b5890c360eb3403f31e2af6e4fb922f87900f4"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.688764 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l" event={"ID":"34d9b54c-a37d-407b-81c7-ff77a96b7dd8","Type":"ContainerStarted","Data":"31111703071ebd8415cd5511a53385f1d81d689cd1e5d5b56c22b288a4cefc1e"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.697284 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" event={"ID":"436a7a59-c116-4383-b580-19d167e74eeb","Type":"ContainerStarted","Data":"5ab8f6d928d0ebbf94644cd91b69c8144db5974aa2ca393e0c42e54013888c71"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.698257 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.703685 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" 
event={"ID":"9dcee5b9-ae18-4145-9af2-c4ebf1e36294","Type":"ContainerStarted","Data":"9f32b1b94e5da0cb91a9132f36b536dfb6659546d0ca69157918b150a030fe7a"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.703729 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" event={"ID":"9dcee5b9-ae18-4145-9af2-c4ebf1e36294","Type":"ContainerStarted","Data":"868bdfbe4dd97afd49fa5fec0dc09eb70c452b59b1090325b406de8f29760148"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.705527 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" event={"ID":"b66c0222-b5e0-4d1e-841e-507c8e61e482","Type":"ContainerStarted","Data":"06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.706140 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.706874 4923 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-292s5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.706924 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.708352 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96lgm" 
event={"ID":"ea015236-90a9-4feb-bc5a-800e9831fcd6","Type":"ContainerStarted","Data":"5b048680e39c79c50391a549cf1f0fa76fc47195c8701ab28ade079dd8b78f50"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.715619 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-x278g" podStartSLOduration=47.715599789 podStartE2EDuration="47.715599789s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.680229507 +0000 UTC m=+123.697300320" watchObservedRunningTime="2026-02-24 02:56:39.715599789 +0000 UTC m=+123.732670602" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.734686 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.734716 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" event={"ID":"e47a3768-fc24-4a98-8ed4-2264127d71cd","Type":"ContainerStarted","Data":"57bb97188380fb98cdfa89c4cc8dbcf51742b8dd4e47c4f3791c5a24d5f1c380"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.734732 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" event={"ID":"e47a3768-fc24-4a98-8ed4-2264127d71cd","Type":"ContainerStarted","Data":"cfb9c3e5683d1a6582578760abd771a424433e49471f044230ee9c9f5c9bfc41"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.750402 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" event={"ID":"e59578f2-07d5-4eb9-8b58-22a2b4f73a3b","Type":"ContainerStarted","Data":"57ee0c47c7654a13eb14e0a47a95cff46a5d3ea622c2c17e81528c539e485271"} Feb 24 
02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.755281 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" event={"ID":"b48a43d4-438d-40a9-8982-74ea7020664a","Type":"ContainerStarted","Data":"91229f825580ab06f2d0164c989f69363638df96dbcaa026a1bfd26d2e5f8f91"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.755331 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" event={"ID":"b48a43d4-438d-40a9-8982-74ea7020664a","Type":"ContainerStarted","Data":"921aa3b553a070990558baf7ecf8cb3eec29f5281d0ebd8e1c5285d280f0ba4f"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.758078 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9jk4q" event={"ID":"c62f6b93-9a26-4b21-a8b9-d86b7ab4e75c","Type":"ContainerStarted","Data":"f1811927ad69f998ada99f7cb0479efc317ce49618adf32233bd16709300ce11"} Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.758108 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9jk4q" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.759501 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" podUID="0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" containerName="controller-manager" containerID="cri-o://722b2831c7bf46342a36cc1f84d64be8d42bebbeb7af8ed6923de47bad87a84b" gracePeriod=30 Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.759830 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb" podStartSLOduration=47.759820645 podStartE2EDuration="47.759820645s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 
02:56:39.716038791 +0000 UTC m=+123.733109614" watchObservedRunningTime="2026-02-24 02:56:39.759820645 +0000 UTC m=+123.776891458" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.775437 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.776775 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" podUID="5b6bae49-7ab6-4aab-bed2-8e6507bc798a" containerName="route-controller-manager" containerID="cri-o://e19942ab5d5b6fad40c33c64990a9f107902231ad7f089b877604ce2a1a4b392" gracePeriod=30 Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.777125 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.27710407 +0000 UTC m=+124.294174883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.862860 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" podStartSLOduration=47.862845049 podStartE2EDuration="47.862845049s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.773736391 +0000 UTC m=+123.790807204" watchObservedRunningTime="2026-02-24 02:56:39.862845049 +0000 UTC m=+123.879915862" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.864202 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx" podStartSLOduration=48.864196945 podStartE2EDuration="48.864196945s" podCreationTimestamp="2026-02-24 02:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.85375493 +0000 UTC m=+123.870825743" watchObservedRunningTime="2026-02-24 02:56:39.864196945 +0000 UTC m=+123.881267758" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.878261 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: 
\"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.886702 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:56:39 crc kubenswrapper[4923]: E0224 02:56:39.887029 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.387014346 +0000 UTC m=+124.404085159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.915324 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t4m2g"] Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.916184 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.921716 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.924445 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2jm5s" podStartSLOduration=47.924432272 podStartE2EDuration="47.924432272s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:39.922009308 +0000 UTC m=+123.939080121" watchObservedRunningTime="2026-02-24 02:56:39.924432272 +0000 UTC m=+123.941503085" Feb 24 02:56:39 crc kubenswrapper[4923]: I0224 02:56:39.946853 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4m2g"] Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.017795 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.018001 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.517958526 +0000 UTC m=+124.535029329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.018142 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.018430 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.518418189 +0000 UTC m=+124.535489002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.058605 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hkwfz" podStartSLOduration=48.058584417 podStartE2EDuration="48.058584417s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:40.022620649 +0000 UTC m=+124.039691462" watchObservedRunningTime="2026-02-24 02:56:40.058584417 +0000 UTC m=+124.075655230" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.091791 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-gg97h" podStartSLOduration=48.091776032 podStartE2EDuration="48.091776032s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:40.063174038 +0000 UTC m=+124.080244871" watchObservedRunningTime="2026-02-24 02:56:40.091776032 +0000 UTC m=+124.108846845" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.102355 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qpqqp"] Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.103252 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.103503 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vvpzd" podStartSLOduration=48.10348365 podStartE2EDuration="48.10348365s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:40.095363816 +0000 UTC m=+124.112434629" watchObservedRunningTime="2026-02-24 02:56:40.10348365 +0000 UTC m=+124.120554463" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.113680 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-clfml" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.119022 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.119113 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.619095881 +0000 UTC m=+124.636166684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.119500 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-utilities\") pod \"certified-operators-t4m2g\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.119529 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-utilities\") pod \"community-operators-qpqqp\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.119556 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.119577 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thms4\" (UniqueName: \"kubernetes.io/projected/906acf28-a57e-4f51-816e-5936cba1548f-kube-api-access-thms4\") 
pod \"certified-operators-t4m2g\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.119630 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75dbv\" (UniqueName: \"kubernetes.io/projected/844ee205-faee-4873-978e-cf3d64cd8397-kube-api-access-75dbv\") pod \"community-operators-qpqqp\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.119659 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-catalog-content\") pod \"community-operators-qpqqp\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.119693 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-catalog-content\") pod \"certified-operators-t4m2g\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.119955 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.619947744 +0000 UTC m=+124.637018547 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.122856 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.139846 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-q2s27" podStartSLOduration=48.139830048 podStartE2EDuration="48.139830048s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:40.137956568 +0000 UTC m=+124.155027381" watchObservedRunningTime="2026-02-24 02:56:40.139830048 +0000 UTC m=+124.156900861" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.157011 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 02:56:40 crc kubenswrapper[4923]: [-]has-synced failed: reason withheld Feb 24 02:56:40 crc kubenswrapper[4923]: [+]process-running ok Feb 24 02:56:40 crc kubenswrapper[4923]: healthz check failed Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.157057 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.172288 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qpqqp"] Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.203587 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rx89l" podStartSLOduration=48.203570117 podStartE2EDuration="48.203570117s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:40.203412323 +0000 UTC m=+124.220483136" watchObservedRunningTime="2026-02-24 02:56:40.203570117 +0000 UTC m=+124.220640920" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.205435 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9jk4q" podStartSLOduration=8.205428526 podStartE2EDuration="8.205428526s" podCreationTimestamp="2026-02-24 02:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:40.181617399 +0000 UTC m=+124.198688212" watchObservedRunningTime="2026-02-24 02:56:40.205428526 +0000 UTC m=+124.222499339" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.221630 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.221858 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75dbv\" (UniqueName: 
\"kubernetes.io/projected/844ee205-faee-4873-978e-cf3d64cd8397-kube-api-access-75dbv\") pod \"community-operators-qpqqp\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.221901 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-catalog-content\") pod \"community-operators-qpqqp\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.221946 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-catalog-content\") pod \"certified-operators-t4m2g\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.221973 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-utilities\") pod \"certified-operators-t4m2g\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.221992 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-utilities\") pod \"community-operators-qpqqp\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.222018 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thms4\" (UniqueName: 
\"kubernetes.io/projected/906acf28-a57e-4f51-816e-5936cba1548f-kube-api-access-thms4\") pod \"certified-operators-t4m2g\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.222277 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.72226041 +0000 UTC m=+124.739331223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.222916 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-catalog-content\") pod \"community-operators-qpqqp\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.223362 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-utilities\") pod \"community-operators-qpqqp\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.223604 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-catalog-content\") pod \"certified-operators-t4m2g\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.225347 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-utilities\") pod \"certified-operators-t4m2g\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.266275 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75dbv\" (UniqueName: \"kubernetes.io/projected/844ee205-faee-4873-978e-cf3d64cd8397-kube-api-access-75dbv\") pod \"community-operators-qpqqp\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.275389 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thms4\" (UniqueName: \"kubernetes.io/projected/906acf28-a57e-4f51-816e-5936cba1548f-kube-api-access-thms4\") pod \"certified-operators-t4m2g\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.286734 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" podStartSLOduration=48.286717518 podStartE2EDuration="48.286717518s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:40.237194103 +0000 UTC m=+124.254264926" watchObservedRunningTime="2026-02-24 02:56:40.286717518 +0000 UTC 
m=+124.303788331" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.306061 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bccxn"] Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.311390 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.315270 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bccxn"] Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.323544 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hk45\" (UniqueName: \"kubernetes.io/projected/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-kube-api-access-7hk45\") pod \"certified-operators-bccxn\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.323642 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.323671 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-catalog-content\") pod \"certified-operators-bccxn\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.323752 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-utilities\") pod \"certified-operators-bccxn\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.323927 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.823914328 +0000 UTC m=+124.840985131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.394592 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.424497 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.424688 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 02:56:40.924658963 +0000 UTC m=+124.941729776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.424731 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.424797 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-catalog-content\") pod \"certified-operators-bccxn\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.424971 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-utilities\") pod \"certified-operators-bccxn\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.425006 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hk45\" (UniqueName: 
\"kubernetes.io/projected/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-kube-api-access-7hk45\") pod \"certified-operators-bccxn\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.425105 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:40.925090934 +0000 UTC m=+124.942161747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.425645 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-catalog-content\") pod \"certified-operators-bccxn\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.425754 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-utilities\") pod \"certified-operators-bccxn\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.433029 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.450243 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.465002 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hk45\" (UniqueName: \"kubernetes.io/projected/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-kube-api-access-7hk45\") pod \"certified-operators-bccxn\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.506015 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-77dzs"] Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.514979 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.527044 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.527148 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-utilities\") pod \"community-operators-77dzs\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.527239 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xvwx8\" (UniqueName: \"kubernetes.io/projected/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-kube-api-access-xvwx8\") pod \"community-operators-77dzs\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.527264 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-catalog-content\") pod \"community-operators-77dzs\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.527360 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.027346829 +0000 UTC m=+125.044417642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.584889 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-77dzs"] Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.615966 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6txs5" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.637212 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwx8\" (UniqueName: \"kubernetes.io/projected/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-kube-api-access-xvwx8\") pod \"community-operators-77dzs\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.637251 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.637272 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-catalog-content\") pod \"community-operators-77dzs\" (UID: 
\"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.637546 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.137535852 +0000 UTC m=+125.154606665 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.637711 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-utilities\") pod \"community-operators-77dzs\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.638148 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-utilities\") pod \"community-operators-77dzs\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.638223 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-catalog-content\") pod \"community-operators-77dzs\" (UID: 
\"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.638406 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.662442 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwx8\" (UniqueName: \"kubernetes.io/projected/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-kube-api-access-xvwx8\") pod \"community-operators-77dzs\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.740128 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.740434 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.240418653 +0000 UTC m=+125.257489466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.776849 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zswn9"] Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.824761 4923 generic.go:334] "Generic (PLEG): container finished" podID="a99e06cc-b200-4073-a847-410f9799eb3a" containerID="a76c0488461ed909bb928cc0aadfa91454b9c0426b9fb3c1fef88d255d4142a4" exitCode=0 Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.825069 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" event={"ID":"a99e06cc-b200-4073-a847-410f9799eb3a","Type":"ContainerDied","Data":"a76c0488461ed909bb928cc0aadfa91454b9c0426b9fb3c1fef88d255d4142a4"} Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.830376 4923 generic.go:334] "Generic (PLEG): container finished" podID="5b6bae49-7ab6-4aab-bed2-8e6507bc798a" containerID="e19942ab5d5b6fad40c33c64990a9f107902231ad7f089b877604ce2a1a4b392" exitCode=0 Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.830442 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" event={"ID":"5b6bae49-7ab6-4aab-bed2-8e6507bc798a","Type":"ContainerDied","Data":"e19942ab5d5b6fad40c33c64990a9f107902231ad7f089b877604ce2a1a4b392"} Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.831410 4923 generic.go:334] "Generic (PLEG): container finished" podID="0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" 
containerID="722b2831c7bf46342a36cc1f84d64be8d42bebbeb7af8ed6923de47bad87a84b" exitCode=0 Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.831448 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" event={"ID":"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3","Type":"ContainerDied","Data":"722b2831c7bf46342a36cc1f84d64be8d42bebbeb7af8ed6923de47bad87a84b"} Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.832978 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96lgm" event={"ID":"ea015236-90a9-4feb-bc5a-800e9831fcd6","Type":"ContainerStarted","Data":"e687356271520ed56ef0f43d55ae547a4ad91f7755bb9c7dff24a3e73bee5f3e"} Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.835861 4923 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-292s5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.835907 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.845143 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 
02:56:40.845611 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.345598095 +0000 UTC m=+125.362668908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.873743 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.945941 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.948271 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.44825513 +0000 UTC m=+125.465325943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.948547 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.987011 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7548dc4f95-bb92n"] Feb 24 02:56:40 crc kubenswrapper[4923]: E0224 02:56:40.987434 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" containerName="controller-manager" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.987448 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" containerName="controller-manager" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.987536 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" containerName="controller-manager" Feb 24 02:56:40 crc kubenswrapper[4923]: I0224 02:56:40.987874 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.049511 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-config\") pod \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.049548 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-serving-cert\") pod \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.049583 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-client-ca\") pod \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.049623 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9nhg\" (UniqueName: \"kubernetes.io/projected/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-kube-api-access-p9nhg\") pod \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.049650 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-proxy-ca-bundles\") pod \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\" (UID: \"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.049857 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.050178 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.550167025 +0000 UTC m=+125.567237838 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.053581 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-config" (OuterVolumeSpecName: "config") pod "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" (UID: "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.054162 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" (UID: "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.061372 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-client-ca" (OuterVolumeSpecName: "client-ca") pod "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" (UID: "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.064614 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7548dc4f95-bb92n"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.073245 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" (UID: "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.073391 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-kube-api-access-p9nhg" (OuterVolumeSpecName: "kube-api-access-p9nhg") pod "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" (UID: "0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3"). InnerVolumeSpecName "kube-api-access-p9nhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.083725 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.159947 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160516 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-proxy-ca-bundles\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160582 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hzml\" (UniqueName: \"kubernetes.io/projected/120499d4-e031-4ef7-81c9-099cd0d710e7-kube-api-access-6hzml\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160608 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-config\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160643 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-client-ca\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160672 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/120499d4-e031-4ef7-81c9-099cd0d710e7-serving-cert\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160780 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9nhg\" (UniqueName: \"kubernetes.io/projected/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-kube-api-access-p9nhg\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160795 4923 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160805 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160816 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.160838 4923 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3-client-ca\") on 
node \"crc\" DevicePath \"\"" Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.160926 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.660907053 +0000 UTC m=+125.677977866 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.170913 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 02:56:41 crc kubenswrapper[4923]: [-]has-synced failed: reason withheld Feb 24 02:56:41 crc kubenswrapper[4923]: [+]process-running ok Feb 24 02:56:41 crc kubenswrapper[4923]: healthz check failed Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.170955 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.194563 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4m2g"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.257239 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-qpqqp"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.263501 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-client-ca\") pod \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.263581 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-config\") pod \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.263625 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-serving-cert\") pod \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.267643 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m549\" (UniqueName: \"kubernetes.io/projected/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-kube-api-access-7m549\") pod \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\" (UID: \"5b6bae49-7ab6-4aab-bed2-8e6507bc798a\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.267944 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hzml\" (UniqueName: \"kubernetes.io/projected/120499d4-e031-4ef7-81c9-099cd0d710e7-kube-api-access-6hzml\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.267975 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-config\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.268007 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-client-ca\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.268033 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/120499d4-e031-4ef7-81c9-099cd0d710e7-serving-cert\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.268077 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.268138 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-proxy-ca-bundles\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " 
pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.268454 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-client-ca" (OuterVolumeSpecName: "client-ca") pod "5b6bae49-7ab6-4aab-bed2-8e6507bc798a" (UID: "5b6bae49-7ab6-4aab-bed2-8e6507bc798a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.269510 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-proxy-ca-bundles\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.270630 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-config" (OuterVolumeSpecName: "config") pod "5b6bae49-7ab6-4aab-bed2-8e6507bc798a" (UID: "5b6bae49-7ab6-4aab-bed2-8e6507bc798a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.274675 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-client-ca\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.275850 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-config\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.287136 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bccxn"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.287894 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-kube-api-access-7m549" (OuterVolumeSpecName: "kube-api-access-7m549") pod "5b6bae49-7ab6-4aab-bed2-8e6507bc798a" (UID: "5b6bae49-7ab6-4aab-bed2-8e6507bc798a"). InnerVolumeSpecName "kube-api-access-7m549". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.289608 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/120499d4-e031-4ef7-81c9-099cd0d710e7-serving-cert\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.299284 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.79361203 +0000 UTC m=+125.810682843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.301675 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5b6bae49-7ab6-4aab-bed2-8e6507bc798a" (UID: "5b6bae49-7ab6-4aab-bed2-8e6507bc798a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.322440 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hzml\" (UniqueName: \"kubernetes.io/projected/120499d4-e031-4ef7-81c9-099cd0d710e7-kube-api-access-6hzml\") pod \"controller-manager-7548dc4f95-bb92n\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.349335 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.369008 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.369318 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.869278044 +0000 UTC m=+125.886348847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.369353 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.369479 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.369493 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m549\" (UniqueName: \"kubernetes.io/projected/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-kube-api-access-7m549\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.369502 4923 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.369511 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b6bae49-7ab6-4aab-bed2-8e6507bc798a-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:41 crc kubenswrapper[4923]: 
E0224 02:56:41.369801 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.869789037 +0000 UTC m=+125.886859850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.413786 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-77dzs"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.471358 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.472057 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:41.972041681 +0000 UTC m=+125.989112494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.588991 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.589359 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:42.089345872 +0000 UTC m=+126.106416675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.632008 4923 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.639190 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7548dc4f95-bb92n"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.689858 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.690062 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:42.190029705 +0000 UTC m=+126.207100518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.690225 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.690717 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:42.190706143 +0000 UTC m=+126.207776956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: W0224 02:56:41.697895 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod120499d4_e031_4ef7_81c9_099cd0d710e7.slice/crio-89970fe047f070593bd807b39bd63470e8a1c114943a2f496748a4a96529f408 WatchSource:0}: Error finding container 89970fe047f070593bd807b39bd63470e8a1c114943a2f496748a4a96529f408: Status 404 returned error can't find the container with id 89970fe047f070593bd807b39bd63470e8a1c114943a2f496748a4a96529f408 Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.723837 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.724072 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b6bae49-7ab6-4aab-bed2-8e6507bc798a" containerName="route-controller-manager" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.724086 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b6bae49-7ab6-4aab-bed2-8e6507bc798a" containerName="route-controller-manager" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.724198 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b6bae49-7ab6-4aab-bed2-8e6507bc798a" containerName="route-controller-manager" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.724562 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.728664 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.729069 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.733630 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.791279 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.791435 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:42.291405947 +0000 UTC m=+126.308476760 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.791782 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17fe389c-3464-4891-b53f-9e351e0b38c9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17fe389c-3464-4891-b53f-9e351e0b38c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.791967 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.792097 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17fe389c-3464-4891-b53f-9e351e0b38c9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17fe389c-3464-4891-b53f-9e351e0b38c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.792280 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-24 02:56:42.292272099 +0000 UTC m=+126.309342912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.839903 4923 generic.go:334] "Generic (PLEG): container finished" podID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerID="2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3" exitCode=0 Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.839979 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bccxn" event={"ID":"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b","Type":"ContainerDied","Data":"2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.840006 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bccxn" event={"ID":"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b","Type":"ContainerStarted","Data":"59288b830f49e99efa75f16f90816c7554fed38536a7238319b1b58b931c2a92"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.841477 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.841531 4923 generic.go:334] "Generic (PLEG): container finished" podID="844ee205-faee-4873-978e-cf3d64cd8397" containerID="a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be" exitCode=0 Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.841628 4923 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-qpqqp" event={"ID":"844ee205-faee-4873-978e-cf3d64cd8397","Type":"ContainerDied","Data":"a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.841655 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpqqp" event={"ID":"844ee205-faee-4873-978e-cf3d64cd8397","Type":"ContainerStarted","Data":"6e38422ca8a6e962169ac1f5bfcd7b1515a5037100a205c8bf8249cfe2cddcb0"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.844252 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" event={"ID":"120499d4-e031-4ef7-81c9-099cd0d710e7","Type":"ContainerStarted","Data":"89970fe047f070593bd807b39bd63470e8a1c114943a2f496748a4a96529f408"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.845929 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" event={"ID":"5b6bae49-7ab6-4aab-bed2-8e6507bc798a","Type":"ContainerDied","Data":"a6158bb003b1aa500d8f67f11e5df1afe366bf26337e8b752c84e047fa6ea7cc"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.845986 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.846022 4923 scope.go:117] "RemoveContainer" containerID="e19942ab5d5b6fad40c33c64990a9f107902231ad7f089b877604ce2a1a4b392" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.848235 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77dzs" event={"ID":"770b43e6-56e5-4d30-9d35-f3ce4dcf3563","Type":"ContainerStarted","Data":"c7df0189f731f3347b2034f7069e3614242fe0279a7d0be393be207fd09691fd"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.849789 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" event={"ID":"0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3","Type":"ContainerDied","Data":"5156548ad67fd644c1606739f4d8eed57d03aec9fedce9fcf97a3d107ed018a6"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.849887 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-k2q5j" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.851154 4923 generic.go:334] "Generic (PLEG): container finished" podID="906acf28-a57e-4f51-816e-5936cba1548f" containerID="ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c" exitCode=0 Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.851253 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4m2g" event={"ID":"906acf28-a57e-4f51-816e-5936cba1548f","Type":"ContainerDied","Data":"ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.851283 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4m2g" event={"ID":"906acf28-a57e-4f51-816e-5936cba1548f","Type":"ContainerStarted","Data":"2a9f9fabfc140c26678952b582be04b38082d65f75328dade37f46aba3f08d8c"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.858406 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96lgm" event={"ID":"ea015236-90a9-4feb-bc5a-800e9831fcd6","Type":"ContainerStarted","Data":"d34af577d6cbc57c35603abfc87ac0528de6b91b6bf131433174a0463d886622"} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.858692 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" podUID="30e1ab71-a068-4593-9dc7-f1f7731caeb9" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" gracePeriod=30 Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.865786 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.882345 4923 scope.go:117] 
"RemoveContainer" containerID="722b2831c7bf46342a36cc1f84d64be8d42bebbeb7af8ed6923de47bad87a84b" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.890408 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k5lw8"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.891605 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.895198 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.900153 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.900455 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 02:56:42.400436399 +0000 UTC m=+126.417507202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.900931 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-utilities\") pod \"redhat-marketplace-k5lw8\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.900970 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.901005 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17fe389c-3464-4891-b53f-9e351e0b38c9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17fe389c-3464-4891-b53f-9e351e0b38c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.901040 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-catalog-content\") pod 
\"redhat-marketplace-k5lw8\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.901074 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf9mj\" (UniqueName: \"kubernetes.io/projected/3c62fecb-b531-4754-a747-902b75b2350d-kube-api-access-gf9mj\") pod \"redhat-marketplace-k5lw8\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.901170 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17fe389c-3464-4891-b53f-9e351e0b38c9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17fe389c-3464-4891-b53f-9e351e0b38c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.901253 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17fe389c-3464-4891-b53f-9e351e0b38c9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17fe389c-3464-4891-b53f-9e351e0b38c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 02:56:41 crc kubenswrapper[4923]: E0224 02:56:41.901786 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 02:56:42.401774284 +0000 UTC m=+126.418845177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-95gv5" (UID: "7227def5-b373-488f-9f56-4b6ed170751d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.903379 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5lw8"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.905871 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k2q5j"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.914285 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-k2q5j"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.918455 4923 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-24T02:56:41.632032537Z","Handler":null,"Name":""} Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.928487 4923 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.928534 4923 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.939120 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548"] 
Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.942177 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tw548"] Feb 24 02:56:41 crc kubenswrapper[4923]: I0224 02:56:41.955749 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17fe389c-3464-4891-b53f-9e351e0b38c9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17fe389c-3464-4891-b53f-9e351e0b38c9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.006242 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.007695 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-utilities\") pod \"redhat-marketplace-k5lw8\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.007800 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-catalog-content\") pod \"redhat-marketplace-k5lw8\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.007918 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf9mj\" (UniqueName: 
\"kubernetes.io/projected/3c62fecb-b531-4754-a747-902b75b2350d-kube-api-access-gf9mj\") pod \"redhat-marketplace-k5lw8\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.008170 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-utilities\") pod \"redhat-marketplace-k5lw8\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.008483 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-catalog-content\") pod \"redhat-marketplace-k5lw8\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.011061 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.043916 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf9mj\" (UniqueName: \"kubernetes.io/projected/3c62fecb-b531-4754-a747-902b75b2350d-kube-api-access-gf9mj\") pod \"redhat-marketplace-k5lw8\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.089611 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.109135 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a99e06cc-b200-4073-a847-410f9799eb3a-config-volume\") pod \"a99e06cc-b200-4073-a847-410f9799eb3a\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.109262 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46qbb\" (UniqueName: \"kubernetes.io/projected/a99e06cc-b200-4073-a847-410f9799eb3a-kube-api-access-46qbb\") pod \"a99e06cc-b200-4073-a847-410f9799eb3a\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.109315 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a99e06cc-b200-4073-a847-410f9799eb3a-secret-volume\") pod \"a99e06cc-b200-4073-a847-410f9799eb3a\" (UID: \"a99e06cc-b200-4073-a847-410f9799eb3a\") " Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.109454 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.109867 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a99e06cc-b200-4073-a847-410f9799eb3a-config-volume" (OuterVolumeSpecName: "config-volume") pod "a99e06cc-b200-4073-a847-410f9799eb3a" (UID: "a99e06cc-b200-4073-a847-410f9799eb3a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.110273 4923 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a99e06cc-b200-4073-a847-410f9799eb3a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.113752 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a99e06cc-b200-4073-a847-410f9799eb3a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a99e06cc-b200-4073-a847-410f9799eb3a" (UID: "a99e06cc-b200-4073-a847-410f9799eb3a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.114090 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99e06cc-b200-4073-a847-410f9799eb3a-kube-api-access-46qbb" (OuterVolumeSpecName: "kube-api-access-46qbb") pod "a99e06cc-b200-4073-a847-410f9799eb3a" (UID: "a99e06cc-b200-4073-a847-410f9799eb3a"). InnerVolumeSpecName "kube-api-access-46qbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.115509 4923 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.115546 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.144897 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.150662 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-95gv5\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.158774 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 02:56:42 crc kubenswrapper[4923]: [-]has-synced failed: reason withheld Feb 24 02:56:42 crc kubenswrapper[4923]: [+]process-running ok Feb 24 02:56:42 crc 
kubenswrapper[4923]: healthz check failed Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.158840 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.211939 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46qbb\" (UniqueName: \"kubernetes.io/projected/a99e06cc-b200-4073-a847-410f9799eb3a-kube-api-access-46qbb\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.212233 4923 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a99e06cc-b200-4073-a847-410f9799eb3a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.225801 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.282416 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdqx"] Feb 24 02:56:42 crc kubenswrapper[4923]: E0224 02:56:42.282890 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99e06cc-b200-4073-a847-410f9799eb3a" containerName="collect-profiles" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.282901 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99e06cc-b200-4073-a847-410f9799eb3a" containerName="collect-profiles" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.282988 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99e06cc-b200-4073-a847-410f9799eb3a" containerName="collect-profiles" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.283791 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.300922 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdqx"] Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.312893 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-catalog-content\") pod \"redhat-marketplace-4mdqx\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.312972 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s8kd\" (UniqueName: \"kubernetes.io/projected/1f219509-d66e-49c2-bc60-9de19d02d5f0-kube-api-access-7s8kd\") pod \"redhat-marketplace-4mdqx\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.313028 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-utilities\") pod \"redhat-marketplace-4mdqx\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.343691 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.352131 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.413350 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-catalog-content\") pod \"redhat-marketplace-4mdqx\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.413420 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s8kd\" (UniqueName: \"kubernetes.io/projected/1f219509-d66e-49c2-bc60-9de19d02d5f0-kube-api-access-7s8kd\") pod \"redhat-marketplace-4mdqx\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.413459 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-utilities\") pod \"redhat-marketplace-4mdqx\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.414042 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-utilities\") pod \"redhat-marketplace-4mdqx\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.414073 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-catalog-content\") pod \"redhat-marketplace-4mdqx\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " 
pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.432284 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s8kd\" (UniqueName: \"kubernetes.io/projected/1f219509-d66e-49c2-bc60-9de19d02d5f0-kube-api-access-7s8kd\") pod \"redhat-marketplace-4mdqx\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.600716 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.607258 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.751312 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5lw8"] Feb 24 02:56:42 crc kubenswrapper[4923]: W0224 02:56:42.763335 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c62fecb_b531_4754_a747_902b75b2350d.slice/crio-56e185d2688874812cad1c013b4bf7af4a5f52606cb6e4eb556e03276fd97891 WatchSource:0}: Error finding container 56e185d2688874812cad1c013b4bf7af4a5f52606cb6e4eb556e03276fd97891: Status 404 returned error can't find the container with id 56e185d2688874812cad1c013b4bf7af4a5f52606cb6e4eb556e03276fd97891 Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.813970 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-95gv5"] Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.886961 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" 
event={"ID":"120499d4-e031-4ef7-81c9-099cd0d710e7","Type":"ContainerStarted","Data":"d9c3e65d0b4bbda1cc260458c07a5bbc8802cbd43c285b5e8040d1bbe208867c"} Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.887898 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.893670 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" event={"ID":"a99e06cc-b200-4073-a847-410f9799eb3a","Type":"ContainerDied","Data":"e0f30223384b9fe3815bb5416883278d6902c2c962783c41cbcea5c777ec12c7"} Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.893694 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.893709 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f30223384b9fe3815bb5416883278d6902c2c962783c41cbcea5c777ec12c7" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.895995 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" event={"ID":"7227def5-b373-488f-9f56-4b6ed170751d","Type":"ContainerStarted","Data":"f6ea8b0bba77d4eb42b7e9e465d87d18d8bc41669adafd50d2229f607ea7f8df"} Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.899790 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.945846 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" podStartSLOduration=4.945828735 podStartE2EDuration="4.945828735s" podCreationTimestamp="2026-02-24 02:56:38 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:42.92364644 +0000 UTC m=+126.940717253" watchObservedRunningTime="2026-02-24 02:56:42.945828735 +0000 UTC m=+126.962899548" Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.953135 4923 generic.go:334] "Generic (PLEG): container finished" podID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerID="51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb" exitCode=0 Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.953448 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77dzs" event={"ID":"770b43e6-56e5-4d30-9d35-f3ce4dcf3563","Type":"ContainerDied","Data":"51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb"} Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.959374 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdqx"] Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.964925 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5lw8" event={"ID":"3c62fecb-b531-4754-a747-902b75b2350d","Type":"ContainerStarted","Data":"56e185d2688874812cad1c013b4bf7af4a5f52606cb6e4eb556e03276fd97891"} Feb 24 02:56:42 crc kubenswrapper[4923]: I0224 02:56:42.984769 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-96lgm" event={"ID":"ea015236-90a9-4feb-bc5a-800e9831fcd6","Type":"ContainerStarted","Data":"6153f7cfbcc5bbafd47d64d4006c082ce845d8b17c6af22bf289969d4e712302"} Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.002255 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"17fe389c-3464-4891-b53f-9e351e0b38c9","Type":"ContainerStarted","Data":"a9a2191eff978b01d3d90cfc8beb17dc359756af6c457cf23483888077e97512"} 
Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.014504 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-96lgm" podStartSLOduration=11.014490224 podStartE2EDuration="11.014490224s" podCreationTimestamp="2026-02-24 02:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:43.014044772 +0000 UTC m=+127.031115585" watchObservedRunningTime="2026-02-24 02:56:43.014490224 +0000 UTC m=+127.031561037" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.167479 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 02:56:43 crc kubenswrapper[4923]: [-]has-synced failed: reason withheld Feb 24 02:56:43 crc kubenswrapper[4923]: [+]process-running ok Feb 24 02:56:43 crc kubenswrapper[4923]: healthz check failed Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.167532 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.288421 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f4g69"] Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.289390 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.292025 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.302991 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f4g69"] Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.441535 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzrh8\" (UniqueName: \"kubernetes.io/projected/db365e80-350f-4a1f-955c-5d73c4704241-kube-api-access-lzrh8\") pod \"redhat-operators-f4g69\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.441592 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-catalog-content\") pod \"redhat-operators-f4g69\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.441628 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-utilities\") pod \"redhat-operators-f4g69\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.537367 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.538810 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.540799 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.542214 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.542522 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzrh8\" (UniqueName: \"kubernetes.io/projected/db365e80-350f-4a1f-955c-5d73c4704241-kube-api-access-lzrh8\") pod \"redhat-operators-f4g69\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.542565 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-catalog-content\") pod \"redhat-operators-f4g69\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.542612 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-utilities\") pod \"redhat-operators-f4g69\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.560162 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.567957 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc"] Feb 24 
02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.568742 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.572842 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.573007 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.573168 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.573386 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.573476 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.574221 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.581529 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzrh8\" (UniqueName: \"kubernetes.io/projected/db365e80-350f-4a1f-955c-5d73c4704241-kube-api-access-lzrh8\") pod \"redhat-operators-f4g69\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.593982 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-utilities\") pod 
\"redhat-operators-f4g69\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.594212 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-catalog-content\") pod \"redhat-operators-f4g69\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.609768 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc"] Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.620010 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.643362 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa2d5976-55be-43a9-bed2-210db2b452c3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"aa2d5976-55be-43a9-bed2-210db2b452c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.643501 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d5976-55be-43a9-bed2-210db2b452c3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"aa2d5976-55be-43a9-bed2-210db2b452c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.687562 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sh7x6"] Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.688731 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.696218 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sh7x6"] Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.743606 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3" path="/var/lib/kubelet/pods/0aa5ca52-2f21-4bc6-8b77-64ad427ef2a3/volumes" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.744242 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b6bae49-7ab6-4aab-bed2-8e6507bc798a" path="/var/lib/kubelet/pods/5b6bae49-7ab6-4aab-bed2-8e6507bc798a/volumes" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.744723 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d534f39e-ad8c-4216-b373-6b3041fe84d5-serving-cert\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.744807 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-config\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.744861 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d5976-55be-43a9-bed2-210db2b452c3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"aa2d5976-55be-43a9-bed2-210db2b452c3\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.744882 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa2d5976-55be-43a9-bed2-210db2b452c3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"aa2d5976-55be-43a9-bed2-210db2b452c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.744910 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tthfp\" (UniqueName: \"kubernetes.io/projected/d534f39e-ad8c-4216-b373-6b3041fe84d5-kube-api-access-tthfp\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.745104 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d5976-55be-43a9-bed2-210db2b452c3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"aa2d5976-55be-43a9-bed2-210db2b452c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.745140 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-client-ca\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.745959 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 24 
02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.760922 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa2d5976-55be-43a9-bed2-210db2b452c3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"aa2d5976-55be-43a9-bed2-210db2b452c3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.849939 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hkf7\" (UniqueName: \"kubernetes.io/projected/57508484-52d2-4deb-932e-ecfdf603f0f2-kube-api-access-4hkf7\") pod \"redhat-operators-sh7x6\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.850332 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d534f39e-ad8c-4216-b373-6b3041fe84d5-serving-cert\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.850399 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-config\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.850457 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tthfp\" (UniqueName: \"kubernetes.io/projected/d534f39e-ad8c-4216-b373-6b3041fe84d5-kube-api-access-tthfp\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: 
\"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.850494 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-catalog-content\") pod \"redhat-operators-sh7x6\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.850551 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-client-ca\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.850583 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-utilities\") pod \"redhat-operators-sh7x6\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.852708 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-client-ca\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.852802 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-config\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.861115 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d534f39e-ad8c-4216-b373-6b3041fe84d5-serving-cert\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.865364 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.868114 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tthfp\" (UniqueName: \"kubernetes.io/projected/d534f39e-ad8c-4216-b373-6b3041fe84d5-kube-api-access-tthfp\") pod \"route-controller-manager-546bb7bf67-wwvzc\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.911146 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.937964 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f4g69"] Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.951755 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-utilities\") pod \"redhat-operators-sh7x6\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.951845 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hkf7\" (UniqueName: \"kubernetes.io/projected/57508484-52d2-4deb-932e-ecfdf603f0f2-kube-api-access-4hkf7\") pod \"redhat-operators-sh7x6\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.952243 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-catalog-content\") pod \"redhat-operators-sh7x6\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.952511 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-utilities\") pod \"redhat-operators-sh7x6\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.952692 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-catalog-content\") pod \"redhat-operators-sh7x6\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: I0224 02:56:43.968180 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hkf7\" (UniqueName: \"kubernetes.io/projected/57508484-52d2-4deb-932e-ecfdf603f0f2-kube-api-access-4hkf7\") pod \"redhat-operators-sh7x6\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:43 crc kubenswrapper[4923]: W0224 02:56:43.982354 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb365e80_350f_4a1f_955c_5d73c4704241.slice/crio-59538f4838b3abea17788e0b977575d63add14727daa5c16d4fc2c73cd96470d WatchSource:0}: Error finding container 59538f4838b3abea17788e0b977575d63add14727daa5c16d4fc2c73cd96470d: Status 404 returned error can't find the container with id 59538f4838b3abea17788e0b977575d63add14727daa5c16d4fc2c73cd96470d Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.009577 4923 generic.go:334] "Generic (PLEG): container finished" podID="17fe389c-3464-4891-b53f-9e351e0b38c9" containerID="2dc454a32a2fd734e7bae5cea5f29398c42d48591a05918d78590991444d3b58" exitCode=0 Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.009660 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"17fe389c-3464-4891-b53f-9e351e0b38c9","Type":"ContainerDied","Data":"2dc454a32a2fd734e7bae5cea5f29398c42d48591a05918d78590991444d3b58"} Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.011962 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4g69" 
event={"ID":"db365e80-350f-4a1f-955c-5d73c4704241","Type":"ContainerStarted","Data":"59538f4838b3abea17788e0b977575d63add14727daa5c16d4fc2c73cd96470d"} Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.016638 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" event={"ID":"7227def5-b373-488f-9f56-4b6ed170751d","Type":"ContainerStarted","Data":"e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835"} Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.017455 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.020610 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.032525 4923 generic.go:334] "Generic (PLEG): container finished" podID="3c62fecb-b531-4754-a747-902b75b2350d" containerID="398dacdccef17fa0cc364a118005253b3f1d85e6bf83e1a9af616e5aa4937b03" exitCode=0 Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.032607 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5lw8" event={"ID":"3c62fecb-b531-4754-a747-902b75b2350d","Type":"ContainerDied","Data":"398dacdccef17fa0cc364a118005253b3f1d85e6bf83e1a9af616e5aa4937b03"} Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.034242 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.034421 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.035339 4923 generic.go:334] "Generic (PLEG): container finished" podID="1f219509-d66e-49c2-bc60-9de19d02d5f0" 
containerID="4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6" exitCode=0 Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.035584 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdqx" event={"ID":"1f219509-d66e-49c2-bc60-9de19d02d5f0","Type":"ContainerDied","Data":"4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6"} Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.035604 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdqx" event={"ID":"1f219509-d66e-49c2-bc60-9de19d02d5f0","Type":"ContainerStarted","Data":"e14f7ae2786dd7c60933413105064352dab8f31f82fa962520c383f9602ba67a"} Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.037460 4923 patch_prober.go:28] interesting pod/console-f9d7485db-w5k6j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.037495 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-w5k6j" podUID="3f89d640-5e7f-473b-98e3-420780c10024" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.068479 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" podStartSLOduration=52.068462536 podStartE2EDuration="52.068462536s" podCreationTimestamp="2026-02-24 02:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:44.060816835 +0000 UTC m=+128.077887658" watchObservedRunningTime="2026-02-24 02:56:44.068462536 +0000 UTC 
m=+128.085533349"
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.143262 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.174624 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 02:56:44 crc kubenswrapper[4923]: [-]has-synced failed: reason withheld
Feb 24 02:56:44 crc kubenswrapper[4923]: [+]process-running ok
Feb 24 02:56:44 crc kubenswrapper[4923]: healthz check failed
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.174983 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.215839 4923 patch_prober.go:28] interesting pod/downloads-7954f5f757-gsv9x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.215888 4923 patch_prober.go:28] interesting pod/downloads-7954f5f757-gsv9x container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.215957 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gsv9x" podUID="c5c02b8b-cae8-4e73-9e6f-34a8120f00c2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.215893 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gsv9x" podUID="c5c02b8b-cae8-4e73-9e6f-34a8120f00c2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.236566 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc"]
Feb 24 02:56:44 crc kubenswrapper[4923]: W0224 02:56:44.243644 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd534f39e_ad8c_4216_b373_6b3041fe84d5.slice/crio-dd14193cc9307ded56ace76f4f97a4ce14bcd19f18e9ee7d8400db2b983611eb WatchSource:0}: Error finding container dd14193cc9307ded56ace76f4f97a4ce14bcd19f18e9ee7d8400db2b983611eb: Status 404 returned error can't find the container with id dd14193cc9307ded56ace76f4f97a4ce14bcd19f18e9ee7d8400db2b983611eb
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.248029 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.248071 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.282650 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sh7x6"]
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.309284 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.405056 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.405111 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:44 crc kubenswrapper[4923]: I0224 02:56:44.415189 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.047949 4923 generic.go:334] "Generic (PLEG): container finished" podID="db365e80-350f-4a1f-955c-5d73c4704241" containerID="4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58" exitCode=0
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.048241 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4g69" event={"ID":"db365e80-350f-4a1f-955c-5d73c4704241","Type":"ContainerDied","Data":"4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58"}
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.049622 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"aa2d5976-55be-43a9-bed2-210db2b452c3","Type":"ContainerStarted","Data":"7ecb0ddd3c073d05c686a855a73baeb8af40fb2878f648a8fee0d5b8bcf71d71"}
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.053805 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh7x6" event={"ID":"57508484-52d2-4deb-932e-ecfdf603f0f2","Type":"ContainerStarted","Data":"e014c31910bd52420e98cd2b965bba4569db2ee6d7ddd88609f754c629fc35d1"}
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.059589 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc"
event={"ID":"d534f39e-ad8c-4216-b373-6b3041fe84d5","Type":"ContainerStarted","Data":"dd14193cc9307ded56ace76f4f97a4ce14bcd19f18e9ee7d8400db2b983611eb"}
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.065236 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l9wvb"
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.066412 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ll9tx"
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.153202 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ctnr7"
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.163053 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 02:56:45 crc kubenswrapper[4923]: [-]has-synced failed: reason withheld
Feb 24 02:56:45 crc kubenswrapper[4923]: [+]process-running ok
Feb 24 02:56:45 crc kubenswrapper[4923]: healthz check failed
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.163105 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 02:56:45 crc kubenswrapper[4923]: E0224 02:56:45.273986 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 24 02:56:45 crc kubenswrapper[4923]: E0224 02:56:45.276312 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 24 02:56:45 crc kubenswrapper[4923]: E0224 02:56:45.292601 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 24 02:56:45 crc kubenswrapper[4923]: E0224 02:56:45.292668 4923 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" podUID="30e1ab71-a068-4593-9dc7-f1f7731caeb9" containerName="kube-multus-additional-cni-plugins"
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.445467 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.605910 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17fe389c-3464-4891-b53f-9e351e0b38c9-kubelet-dir\") pod \"17fe389c-3464-4891-b53f-9e351e0b38c9\" (UID: \"17fe389c-3464-4891-b53f-9e351e0b38c9\") "
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.606212 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17fe389c-3464-4891-b53f-9e351e0b38c9-kube-api-access\") pod \"17fe389c-3464-4891-b53f-9e351e0b38c9\" (UID: \"17fe389c-3464-4891-b53f-9e351e0b38c9\") "
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.606100 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17fe389c-3464-4891-b53f-9e351e0b38c9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "17fe389c-3464-4891-b53f-9e351e0b38c9" (UID: "17fe389c-3464-4891-b53f-9e351e0b38c9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.606533 4923 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17fe389c-3464-4891-b53f-9e351e0b38c9-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.613475 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17fe389c-3464-4891-b53f-9e351e0b38c9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "17fe389c-3464-4891-b53f-9e351e0b38c9" (UID: "17fe389c-3464-4891-b53f-9e351e0b38c9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:45 crc kubenswrapper[4923]: I0224 02:56:45.723419 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17fe389c-3464-4891-b53f-9e351e0b38c9-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.076880 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"17fe389c-3464-4891-b53f-9e351e0b38c9","Type":"ContainerDied","Data":"a9a2191eff978b01d3d90cfc8beb17dc359756af6c457cf23483888077e97512"}
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.076946 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a2191eff978b01d3d90cfc8beb17dc359756af6c457cf23483888077e97512"
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.076903 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.083827 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" event={"ID":"d534f39e-ad8c-4216-b373-6b3041fe84d5","Type":"ContainerStarted","Data":"ebf777fe73f40c53eb29c662ba14224e50d11a153fac9c1e1b087a34b6fa83dd"}
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.084904 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc"
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.094184 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc"
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.098672 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"aa2d5976-55be-43a9-bed2-210db2b452c3","Type":"ContainerStarted","Data":"6ff4f1220459ce32bed04c10a265669595e955b2df7d3cf6e1d65310ff036e76"}
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.126146 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" podStartSLOduration=8.126126267 podStartE2EDuration="8.126126267s" podCreationTimestamp="2026-02-24 02:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:46.124805217 +0000 UTC m=+130.141876050" watchObservedRunningTime="2026-02-24 02:56:46.126126267 +0000 UTC m=+130.143197080"
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.135071 4923 generic.go:334] "Generic (PLEG): container finished" podID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerID="d89bf519ffd666a73f8b19bc4599f54e96e36bb35495d84b175d5a6193209328" exitCode=0
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.136343 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh7x6" event={"ID":"57508484-52d2-4deb-932e-ecfdf603f0f2","Type":"ContainerDied","Data":"d89bf519ffd666a73f8b19bc4599f54e96e36bb35495d84b175d5a6193209328"}
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.146974 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.146953683 podStartE2EDuration="3.146953683s" podCreationTimestamp="2026-02-24 02:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:56:46.145849889 +0000 UTC m=+130.162920702" watchObservedRunningTime="2026-02-24 02:56:46.146953683 +0000 UTC m=+130.164024496"
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.156460 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 02:56:46 crc kubenswrapper[4923]: [-]has-synced failed: reason withheld
Feb 24 02:56:46 crc kubenswrapper[4923]: [+]process-running ok
Feb 24 02:56:46 crc kubenswrapper[4923]: healthz check failed
Feb 24 02:56:46 crc kubenswrapper[4923]: I0224 02:56:46.156501 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.154763 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 02:56:47 crc kubenswrapper[4923]: [+]has-synced ok
Feb 24 02:56:47 crc kubenswrapper[4923]: [+]process-running ok
Feb 24 02:56:47 crc kubenswrapper[4923]: healthz check failed
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.154823 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.166146 4923 generic.go:334] "Generic (PLEG): container finished" podID="aa2d5976-55be-43a9-bed2-210db2b452c3" containerID="6ff4f1220459ce32bed04c10a265669595e955b2df7d3cf6e1d65310ff036e76" exitCode=0
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.167166 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"aa2d5976-55be-43a9-bed2-210db2b452c3","Type":"ContainerDied","Data":"6ff4f1220459ce32bed04c10a265669595e955b2df7d3cf6e1d65310ff036e76"}
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.266638 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9jk4q"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.550018 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.550107 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.550192 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.550220 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.552717 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.552743 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.552978 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.566042 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.566378 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.567626 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.574841 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.576251 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.737061 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.849695 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 02:56:47 crc kubenswrapper[4923]: I0224 02:56:47.861584 4923 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 02:56:48 crc kubenswrapper[4923]: I0224 02:56:48.156024 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ctnr7"
Feb 24 02:56:48 crc kubenswrapper[4923]: I0224 02:56:48.158698 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ctnr7"
Feb 24 02:56:49 crc kubenswrapper[4923]: I0224 02:56:49.188639 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"43f313c33721711e0c18206dc3ddb80ceff473c546b22cb0e304e65131bc85ff"}
Feb 24 02:56:51 crc kubenswrapper[4923]: I0224 02:56:51.485658 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 02:56:54 crc kubenswrapper[4923]: I0224 02:56:54.040645 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:54 crc kubenswrapper[4923]: I0224 02:56:54.047898 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-w5k6j"
Feb 24 02:56:54 crc kubenswrapper[4923]: I0224 02:56:54.215336 4923 patch_prober.go:28] interesting pod/downloads-7954f5f757-gsv9x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Feb 24 02:56:54 crc kubenswrapper[4923]: I0224 02:56:54.215391 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gsv9x" podUID="c5c02b8b-cae8-4e73-9e6f-34a8120f00c2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Feb 24 02:56:54 crc kubenswrapper[4923]: I0224 02:56:54.215488 4923 patch_prober.go:28] interesting pod/downloads-7954f5f757-gsv9x container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body=
Feb 24 02:56:54 crc kubenswrapper[4923]: I0224 02:56:54.215512 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gsv9x" podUID="c5c02b8b-cae8-4e73-9e6f-34a8120f00c2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused"
Feb 24 02:56:54 crc kubenswrapper[4923]: I0224 02:56:54.742151 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 02:56:54 crc kubenswrapper[4923]: W0224 02:56:54.945498 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-6f31975cc19c891d7fe31688f8b475a605c4f12a86bc3d61fecada8aa237da16 WatchSource:0}: Error finding container 6f31975cc19c891d7fe31688f8b475a605c4f12a86bc3d61fecada8aa237da16: Status 404 returned error can't find the container with id 6f31975cc19c891d7fe31688f8b475a605c4f12a86bc3d61fecada8aa237da16
Feb 24 02:56:54 crc kubenswrapper[4923]: I0224 02:56:54.992443 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.173319 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d5976-55be-43a9-bed2-210db2b452c3-kubelet-dir\") pod \"aa2d5976-55be-43a9-bed2-210db2b452c3\" (UID: \"aa2d5976-55be-43a9-bed2-210db2b452c3\") "
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.173360 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa2d5976-55be-43a9-bed2-210db2b452c3-kube-api-access\") pod \"aa2d5976-55be-43a9-bed2-210db2b452c3\" (UID: \"aa2d5976-55be-43a9-bed2-210db2b452c3\") "
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.173489 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa2d5976-55be-43a9-bed2-210db2b452c3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "aa2d5976-55be-43a9-bed2-210db2b452c3" (UID: "aa2d5976-55be-43a9-bed2-210db2b452c3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.173931 4923 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aa2d5976-55be-43a9-bed2-210db2b452c3-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.182368 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2d5976-55be-43a9-bed2-210db2b452c3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "aa2d5976-55be-43a9-bed2-210db2b452c3" (UID: "aa2d5976-55be-43a9-bed2-210db2b452c3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.224998 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"aa2d5976-55be-43a9-bed2-210db2b452c3","Type":"ContainerDied","Data":"7ecb0ddd3c073d05c686a855a73baeb8af40fb2878f648a8fee0d5b8bcf71d71"}
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.225040 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ecb0ddd3c073d05c686a855a73baeb8af40fb2878f648a8fee0d5b8bcf71d71"
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.225051 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.226090 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ad90e0bc565fa3b2d8d78a885dc7a538926da6722c5033c0f885d4d8897eaef4"}
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.227484 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6f31975cc19c891d7fe31688f8b475a605c4f12a86bc3d61fecada8aa237da16"}
Feb 24 02:56:55 crc kubenswrapper[4923]: E0224 02:56:55.237630 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 24 02:56:55 crc kubenswrapper[4923]: E0224 02:56:55.239512 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code
= Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 24 02:56:55 crc kubenswrapper[4923]: E0224 02:56:55.242105 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 24 02:56:55 crc kubenswrapper[4923]: E0224 02:56:55.242155 4923 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" podUID="30e1ab71-a068-4593-9dc7-f1f7731caeb9" containerName="kube-multus-additional-cni-plugins"
Feb 24 02:56:55 crc kubenswrapper[4923]: I0224 02:56:55.275099 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa2d5976-55be-43a9-bed2-210db2b452c3-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 24 02:57:00 crc kubenswrapper[4923]: I0224 02:57:00.723429 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 24 02:57:02 crc kubenswrapper[4923]: I0224 02:57:02.357698 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5"
Feb 24 02:57:02 crc kubenswrapper[4923]: I0224 02:57:02.373038 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.37302195 podStartE2EDuration="2.37302195s" podCreationTimestamp="2026-02-24 02:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:57:02.368979857 +0000 UTC m=+146.386050670" watchObservedRunningTime="2026-02-24 02:57:02.37302195 +0000 UTC m=+146.390092763"
Feb 24 02:57:04 crc kubenswrapper[4923]: I0224 02:57:04.223487 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gsv9x"
Feb 24 02:57:05 crc kubenswrapper[4923]: E0224 02:57:05.240716 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 24 02:57:05 crc kubenswrapper[4923]: E0224 02:57:05.242409 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 24 02:57:05 crc kubenswrapper[4923]: E0224 02:57:05.243678 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 24 02:57:05 crc kubenswrapper[4923]: E0224 02:57:05.243723 4923 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" podUID="30e1ab71-a068-4593-9dc7-f1f7731caeb9" containerName="kube-multus-additional-cni-plugins"
Feb 24 02:57:05 crc kubenswrapper[4923]: I0224 02:57:05.728504 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Feb 24 02:57:07 crc kubenswrapper[4923]: I0224 02:57:07.728800 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=2.72877714 podStartE2EDuration="2.72877714s" podCreationTimestamp="2026-02-24 02:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:57:07.727931645 +0000 UTC m=+151.745002458" watchObservedRunningTime="2026-02-24 02:57:07.72877714 +0000 UTC m=+151.745847963"
Feb 24 02:57:10 crc kubenswrapper[4923]: E0224 02:57:10.403365 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 24 02:57:10 crc kubenswrapper[4923]: E0224 02:57:10.403844 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75dbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qpqqp_openshift-marketplace(844ee205-faee-4873-978e-cf3d64cd8397): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 24 02:57:10 crc kubenswrapper[4923]: E0224 02:57:10.405893 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qpqqp" podUID="844ee205-faee-4873-978e-cf3d64cd8397"
Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.221347 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.221599 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gf9mj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-k5lw8_openshift-marketplace(3c62fecb-b531-4754-a747-902b75b2350d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.222757 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-k5lw8" podUID="3c62fecb-b531-4754-a747-902b75b2350d"
Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.329459 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-zswn9_30e1ab71-a068-4593-9dc7-f1f7731caeb9/kube-multus-additional-cni-plugins/0.log"
Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.329511 4923 generic.go:334] "Generic (PLEG): container finished" podID="30e1ab71-a068-4593-9dc7-f1f7731caeb9" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" exitCode=137
Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.329797 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" event={"ID":"30e1ab71-a068-4593-9dc7-f1f7731caeb9","Type":"ContainerDied","Data":"1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f"}
Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.412841 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qpqqp" podUID="844ee205-faee-4873-978e-cf3d64cd8397"
Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.412900 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-k5lw8" podUID="3c62fecb-b531-4754-a747-902b75b2350d" Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.611070 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.611653 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvwx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:ni
l,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-77dzs_openshift-marketplace(770b43e6-56e5-4d30-9d35-f3ce4dcf3563): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.613019 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-77dzs" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.631348 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.631536 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7hk45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bccxn_openshift-marketplace(cba8789a-c6f5-4fb3-93a9-ec12a41dba0b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 02:57:13 crc kubenswrapper[4923]: E0224 02:57:13.632778 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bccxn" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" Feb 24 02:57:13 crc 
kubenswrapper[4923]: I0224 02:57:13.750961 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-zswn9_30e1ab71-a068-4593-9dc7-f1f7731caeb9/kube-multus-additional-cni-plugins/0.log" Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.751022 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.945576 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvd29\" (UniqueName: \"kubernetes.io/projected/30e1ab71-a068-4593-9dc7-f1f7731caeb9-kube-api-access-wvd29\") pod \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.945694 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/30e1ab71-a068-4593-9dc7-f1f7731caeb9-ready\") pod \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.945732 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30e1ab71-a068-4593-9dc7-f1f7731caeb9-tuning-conf-dir\") pod \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.945799 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/30e1ab71-a068-4593-9dc7-f1f7731caeb9-cni-sysctl-allowlist\") pod \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\" (UID: \"30e1ab71-a068-4593-9dc7-f1f7731caeb9\") " Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.945851 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/30e1ab71-a068-4593-9dc7-f1f7731caeb9-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "30e1ab71-a068-4593-9dc7-f1f7731caeb9" (UID: "30e1ab71-a068-4593-9dc7-f1f7731caeb9"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.946107 4923 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/30e1ab71-a068-4593-9dc7-f1f7731caeb9-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.946276 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e1ab71-a068-4593-9dc7-f1f7731caeb9-ready" (OuterVolumeSpecName: "ready") pod "30e1ab71-a068-4593-9dc7-f1f7731caeb9" (UID: "30e1ab71-a068-4593-9dc7-f1f7731caeb9"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.946599 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e1ab71-a068-4593-9dc7-f1f7731caeb9-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "30e1ab71-a068-4593-9dc7-f1f7731caeb9" (UID: "30e1ab71-a068-4593-9dc7-f1f7731caeb9"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:13 crc kubenswrapper[4923]: I0224 02:57:13.951847 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e1ab71-a068-4593-9dc7-f1f7731caeb9-kube-api-access-wvd29" (OuterVolumeSpecName: "kube-api-access-wvd29") pod "30e1ab71-a068-4593-9dc7-f1f7731caeb9" (UID: "30e1ab71-a068-4593-9dc7-f1f7731caeb9"). InnerVolumeSpecName "kube-api-access-wvd29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.047444 4923 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/30e1ab71-a068-4593-9dc7-f1f7731caeb9-ready\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.047474 4923 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/30e1ab71-a068-4593-9dc7-f1f7731caeb9-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.047484 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvd29\" (UniqueName: \"kubernetes.io/projected/30e1ab71-a068-4593-9dc7-f1f7731caeb9-kube-api-access-wvd29\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.349956 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5ff6754d2edadeac3f0ee27812eb55983cc18a282c653088a33693761cfec67e"} Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.354796 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-zswn9_30e1ab71-a068-4593-9dc7-f1f7731caeb9/kube-multus-additional-cni-plugins/0.log" Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.354910 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" event={"ID":"30e1ab71-a068-4593-9dc7-f1f7731caeb9","Type":"ContainerDied","Data":"84b23c7581da947b6bdb888347cef89e61e753e150cf52164db9e3aefcc56858"} Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.354921 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zswn9" Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.354960 4923 scope.go:117] "RemoveContainer" containerID="1a1da2c6ea0d1ed6fc17482ec19f3efcc2ae40122953a75eb0c2f51c01ec110f" Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.357104 4923 generic.go:334] "Generic (PLEG): container finished" podID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerID="d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3" exitCode=0 Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.357146 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdqx" event={"ID":"1f219509-d66e-49c2-bc60-9de19d02d5f0","Type":"ContainerDied","Data":"d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3"} Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.358732 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2b7319cc7f3d6d14cafaa4e4a5b6453e3e84ea177cadbdc505facde0b9dd37cc"} Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.359052 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.360786 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1b0b44d01914b6c4392a9a2a976febc1627d052ab787c7996c2dbeab7bfed14d"} Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.477002 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zswn9"] Feb 24 02:57:14 crc kubenswrapper[4923]: I0224 02:57:14.481721 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-multus/cni-sysctl-allowlist-ds-zswn9"] Feb 24 02:57:15 crc kubenswrapper[4923]: I0224 02:57:15.183324 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-74xcz" Feb 24 02:57:15 crc kubenswrapper[4923]: I0224 02:57:15.718427 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e1ab71-a068-4593-9dc7-f1f7731caeb9" path="/var/lib/kubelet/pods/30e1ab71-a068-4593-9dc7-f1f7731caeb9/volumes" Feb 24 02:57:17 crc kubenswrapper[4923]: E0224 02:57:17.421613 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-77dzs" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" Feb 24 02:57:17 crc kubenswrapper[4923]: E0224 02:57:17.422519 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bccxn" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" Feb 24 02:57:18 crc kubenswrapper[4923]: I0224 02:57:18.385288 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdqx" event={"ID":"1f219509-d66e-49c2-bc60-9de19d02d5f0","Type":"ContainerStarted","Data":"ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596"} Feb 24 02:57:18 crc kubenswrapper[4923]: I0224 02:57:18.386784 4923 generic.go:334] "Generic (PLEG): container finished" podID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerID="0721cc11bbb1d07209505099f753e610bbd35eeb802b5f6ab6ebc26b82aa8f7c" exitCode=0 Feb 24 02:57:18 crc kubenswrapper[4923]: I0224 02:57:18.386805 4923 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-sh7x6" event={"ID":"57508484-52d2-4deb-932e-ecfdf603f0f2","Type":"ContainerDied","Data":"0721cc11bbb1d07209505099f753e610bbd35eeb802b5f6ab6ebc26b82aa8f7c"} Feb 24 02:57:18 crc kubenswrapper[4923]: I0224 02:57:18.388271 4923 generic.go:334] "Generic (PLEG): container finished" podID="906acf28-a57e-4f51-816e-5936cba1548f" containerID="6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72" exitCode=0 Feb 24 02:57:18 crc kubenswrapper[4923]: I0224 02:57:18.388331 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4m2g" event={"ID":"906acf28-a57e-4f51-816e-5936cba1548f","Type":"ContainerDied","Data":"6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72"} Feb 24 02:57:18 crc kubenswrapper[4923]: I0224 02:57:18.390427 4923 generic.go:334] "Generic (PLEG): container finished" podID="db365e80-350f-4a1f-955c-5d73c4704241" containerID="68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995" exitCode=0 Feb 24 02:57:18 crc kubenswrapper[4923]: I0224 02:57:18.390466 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4g69" event={"ID":"db365e80-350f-4a1f-955c-5d73c4704241","Type":"ContainerDied","Data":"68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995"} Feb 24 02:57:18 crc kubenswrapper[4923]: I0224 02:57:18.407482 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4mdqx" podStartSLOduration=2.343983501 podStartE2EDuration="36.407459809s" podCreationTimestamp="2026-02-24 02:56:42 +0000 UTC" firstStartedPulling="2026-02-24 02:56:44.039889653 +0000 UTC m=+128.056960456" lastFinishedPulling="2026-02-24 02:57:18.103365951 +0000 UTC m=+162.120436764" observedRunningTime="2026-02-24 02:57:18.400980032 +0000 UTC m=+162.418050845" watchObservedRunningTime="2026-02-24 02:57:18.407459809 +0000 UTC m=+162.424530622" Feb 24 
02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.398125 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4m2g" event={"ID":"906acf28-a57e-4f51-816e-5936cba1548f","Type":"ContainerStarted","Data":"24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c"} Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.403021 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4g69" event={"ID":"db365e80-350f-4a1f-955c-5d73c4704241","Type":"ContainerStarted","Data":"122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df"} Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.412772 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh7x6" event={"ID":"57508484-52d2-4deb-932e-ecfdf603f0f2","Type":"ContainerStarted","Data":"c7b1c39c136dd90f95c5c54fce670e79d9bbec840ad54cb456867e5ec003e16c"} Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.430635 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t4m2g" podStartSLOduration=3.448922559 podStartE2EDuration="40.430617797s" podCreationTimestamp="2026-02-24 02:56:39 +0000 UTC" firstStartedPulling="2026-02-24 02:56:41.853283816 +0000 UTC m=+125.870354629" lastFinishedPulling="2026-02-24 02:57:18.834979054 +0000 UTC m=+162.852049867" observedRunningTime="2026-02-24 02:57:19.421067366 +0000 UTC m=+163.438138189" watchObservedRunningTime="2026-02-24 02:57:19.430617797 +0000 UTC m=+163.447688610" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.473767 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sh7x6" podStartSLOduration=3.78892849 podStartE2EDuration="36.473746633s" podCreationTimestamp="2026-02-24 02:56:43 +0000 UTC" firstStartedPulling="2026-02-24 02:56:46.141002651 +0000 UTC m=+130.158073454" 
lastFinishedPulling="2026-02-24 02:57:18.825820784 +0000 UTC m=+162.842891597" observedRunningTime="2026-02-24 02:57:19.445733609 +0000 UTC m=+163.462804422" watchObservedRunningTime="2026-02-24 02:57:19.473746633 +0000 UTC m=+163.490817446" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.474083 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f4g69" podStartSLOduration=3.820169213 podStartE2EDuration="36.474075883s" podCreationTimestamp="2026-02-24 02:56:43 +0000 UTC" firstStartedPulling="2026-02-24 02:56:46.140447884 +0000 UTC m=+130.157518697" lastFinishedPulling="2026-02-24 02:57:18.794354554 +0000 UTC m=+162.811425367" observedRunningTime="2026-02-24 02:57:19.467313387 +0000 UTC m=+163.484384210" watchObservedRunningTime="2026-02-24 02:57:19.474075883 +0000 UTC m=+163.491146686" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.738893 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 02:57:19 crc kubenswrapper[4923]: E0224 02:57:19.739168 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17fe389c-3464-4891-b53f-9e351e0b38c9" containerName="pruner" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.739190 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="17fe389c-3464-4891-b53f-9e351e0b38c9" containerName="pruner" Feb 24 02:57:19 crc kubenswrapper[4923]: E0224 02:57:19.739209 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e1ab71-a068-4593-9dc7-f1f7731caeb9" containerName="kube-multus-additional-cni-plugins" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.739217 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e1ab71-a068-4593-9dc7-f1f7731caeb9" containerName="kube-multus-additional-cni-plugins" Feb 24 02:57:19 crc kubenswrapper[4923]: E0224 02:57:19.739235 4923 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa2d5976-55be-43a9-bed2-210db2b452c3" containerName="pruner" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.739243 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2d5976-55be-43a9-bed2-210db2b452c3" containerName="pruner" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.739377 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="17fe389c-3464-4891-b53f-9e351e0b38c9" containerName="pruner" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.739396 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2d5976-55be-43a9-bed2-210db2b452c3" containerName="pruner" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.739407 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e1ab71-a068-4593-9dc7-f1f7731caeb9" containerName="kube-multus-additional-cni-plugins" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.739841 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.742255 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.742284 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.753055 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.926254 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27447569-e0b5-4db6-915e-fe063cf88e81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"27447569-e0b5-4db6-915e-fe063cf88e81\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 02:57:19 crc kubenswrapper[4923]: I0224 02:57:19.926642 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27447569-e0b5-4db6-915e-fe063cf88e81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"27447569-e0b5-4db6-915e-fe063cf88e81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 02:57:20 crc kubenswrapper[4923]: I0224 02:57:20.027854 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27447569-e0b5-4db6-915e-fe063cf88e81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"27447569-e0b5-4db6-915e-fe063cf88e81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 02:57:20 crc kubenswrapper[4923]: I0224 02:57:20.027900 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27447569-e0b5-4db6-915e-fe063cf88e81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"27447569-e0b5-4db6-915e-fe063cf88e81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 02:57:20 crc kubenswrapper[4923]: I0224 02:57:20.028002 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27447569-e0b5-4db6-915e-fe063cf88e81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"27447569-e0b5-4db6-915e-fe063cf88e81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 02:57:20 crc kubenswrapper[4923]: I0224 02:57:20.053054 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27447569-e0b5-4db6-915e-fe063cf88e81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"27447569-e0b5-4db6-915e-fe063cf88e81\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 
02:57:20 crc kubenswrapper[4923]: I0224 02:57:20.054924 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 02:57:20 crc kubenswrapper[4923]: I0224 02:57:20.395695 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:57:20 crc kubenswrapper[4923]: I0224 02:57:20.395985 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:57:20 crc kubenswrapper[4923]: I0224 02:57:20.536386 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 02:57:21 crc kubenswrapper[4923]: I0224 02:57:21.425482 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"27447569-e0b5-4db6-915e-fe063cf88e81","Type":"ContainerStarted","Data":"50adfd7e97e81b13a8b1ded18ebf4fa6513252694bf9253dd7381ef393f4eab5"} Feb 24 02:57:21 crc kubenswrapper[4923]: I0224 02:57:21.425771 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"27447569-e0b5-4db6-915e-fe063cf88e81","Type":"ContainerStarted","Data":"7f9c2cd397dc7ea23bdd33ae55a6a5de0e469415bbe79f20ca0f8e39cd156d09"} Feb 24 02:57:21 crc kubenswrapper[4923]: I0224 02:57:21.441623 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.441607005 podStartE2EDuration="2.441607005s" podCreationTimestamp="2026-02-24 02:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:57:21.441089179 +0000 UTC m=+165.458160002" watchObservedRunningTime="2026-02-24 02:57:21.441607005 +0000 UTC m=+165.458677818" Feb 24 02:57:21 crc kubenswrapper[4923]: 
I0224 02:57:21.573159 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-t4m2g" podUID="906acf28-a57e-4f51-816e-5936cba1548f" containerName="registry-server" probeResult="failure" output=< Feb 24 02:57:21 crc kubenswrapper[4923]: timeout: failed to connect service ":50051" within 1s Feb 24 02:57:21 crc kubenswrapper[4923]: > Feb 24 02:57:22 crc kubenswrapper[4923]: I0224 02:57:22.430679 4923 generic.go:334] "Generic (PLEG): container finished" podID="27447569-e0b5-4db6-915e-fe063cf88e81" containerID="50adfd7e97e81b13a8b1ded18ebf4fa6513252694bf9253dd7381ef393f4eab5" exitCode=0 Feb 24 02:57:22 crc kubenswrapper[4923]: I0224 02:57:22.430726 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"27447569-e0b5-4db6-915e-fe063cf88e81","Type":"ContainerDied","Data":"50adfd7e97e81b13a8b1ded18ebf4fa6513252694bf9253dd7381ef393f4eab5"} Feb 24 02:57:22 crc kubenswrapper[4923]: I0224 02:57:22.601517 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:57:22 crc kubenswrapper[4923]: I0224 02:57:22.601595 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:57:22 crc kubenswrapper[4923]: I0224 02:57:22.652159 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.473799 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.621319 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.624473 4923 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.693942 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.779989 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27447569-e0b5-4db6-915e-fe063cf88e81-kubelet-dir\") pod \"27447569-e0b5-4db6-915e-fe063cf88e81\" (UID: \"27447569-e0b5-4db6-915e-fe063cf88e81\") " Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.780051 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27447569-e0b5-4db6-915e-fe063cf88e81-kube-api-access\") pod \"27447569-e0b5-4db6-915e-fe063cf88e81\" (UID: \"27447569-e0b5-4db6-915e-fe063cf88e81\") " Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.780122 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27447569-e0b5-4db6-915e-fe063cf88e81-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "27447569-e0b5-4db6-915e-fe063cf88e81" (UID: "27447569-e0b5-4db6-915e-fe063cf88e81"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.780349 4923 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27447569-e0b5-4db6-915e-fe063cf88e81-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.790540 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27447569-e0b5-4db6-915e-fe063cf88e81-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "27447569-e0b5-4db6-915e-fe063cf88e81" (UID: "27447569-e0b5-4db6-915e-fe063cf88e81"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:23 crc kubenswrapper[4923]: I0224 02:57:23.880796 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27447569-e0b5-4db6-915e-fe063cf88e81-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:24 crc kubenswrapper[4923]: I0224 02:57:24.021332 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:57:24 crc kubenswrapper[4923]: I0224 02:57:24.021415 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:57:24 crc kubenswrapper[4923]: I0224 02:57:24.440925 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"27447569-e0b5-4db6-915e-fe063cf88e81","Type":"ContainerDied","Data":"7f9c2cd397dc7ea23bdd33ae55a6a5de0e469415bbe79f20ca0f8e39cd156d09"} Feb 24 02:57:24 crc kubenswrapper[4923]: I0224 02:57:24.440963 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 02:57:24 crc kubenswrapper[4923]: I0224 02:57:24.441028 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f9c2cd397dc7ea23bdd33ae55a6a5de0e469415bbe79f20ca0f8e39cd156d09" Feb 24 02:57:24 crc kubenswrapper[4923]: I0224 02:57:24.490601 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdqx"] Feb 24 02:57:24 crc kubenswrapper[4923]: I0224 02:57:24.665241 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f4g69" podUID="db365e80-350f-4a1f-955c-5d73c4704241" containerName="registry-server" probeResult="failure" output=< Feb 24 02:57:24 crc kubenswrapper[4923]: timeout: failed to connect service ":50051" within 1s Feb 24 02:57:24 crc kubenswrapper[4923]: > Feb 24 02:57:25 crc kubenswrapper[4923]: I0224 02:57:25.058929 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sh7x6" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerName="registry-server" probeResult="failure" output=< Feb 24 02:57:25 crc kubenswrapper[4923]: timeout: failed to connect service ":50051" within 1s Feb 24 02:57:25 crc kubenswrapper[4923]: > Feb 24 02:57:25 crc kubenswrapper[4923]: I0224 02:57:25.444887 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4mdqx" podUID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerName="registry-server" containerID="cri-o://ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596" gracePeriod=2 Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.275256 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.449437 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s8kd\" (UniqueName: \"kubernetes.io/projected/1f219509-d66e-49c2-bc60-9de19d02d5f0-kube-api-access-7s8kd\") pod \"1f219509-d66e-49c2-bc60-9de19d02d5f0\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.449733 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-utilities\") pod \"1f219509-d66e-49c2-bc60-9de19d02d5f0\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.449801 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-catalog-content\") pod \"1f219509-d66e-49c2-bc60-9de19d02d5f0\" (UID: \"1f219509-d66e-49c2-bc60-9de19d02d5f0\") " Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.450404 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-utilities" (OuterVolumeSpecName: "utilities") pod "1f219509-d66e-49c2-bc60-9de19d02d5f0" (UID: "1f219509-d66e-49c2-bc60-9de19d02d5f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.452918 4923 generic.go:334] "Generic (PLEG): container finished" podID="3c62fecb-b531-4754-a747-902b75b2350d" containerID="243cc7fd0c396edcdffac2275a32713f681c13e48e5631e722100ede792e7cd2" exitCode=0 Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.452992 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5lw8" event={"ID":"3c62fecb-b531-4754-a747-902b75b2350d","Type":"ContainerDied","Data":"243cc7fd0c396edcdffac2275a32713f681c13e48e5631e722100ede792e7cd2"} Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.455595 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdqx" event={"ID":"1f219509-d66e-49c2-bc60-9de19d02d5f0","Type":"ContainerDied","Data":"ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596"} Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.455631 4923 generic.go:334] "Generic (PLEG): container finished" podID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerID="ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596" exitCode=0 Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.455651 4923 scope.go:117] "RemoveContainer" containerID="ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.455661 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4mdqx" event={"ID":"1f219509-d66e-49c2-bc60-9de19d02d5f0","Type":"ContainerDied","Data":"e14f7ae2786dd7c60933413105064352dab8f31f82fa962520c383f9602ba67a"} Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.455770 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4mdqx" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.456847 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f219509-d66e-49c2-bc60-9de19d02d5f0-kube-api-access-7s8kd" (OuterVolumeSpecName: "kube-api-access-7s8kd") pod "1f219509-d66e-49c2-bc60-9de19d02d5f0" (UID: "1f219509-d66e-49c2-bc60-9de19d02d5f0"). InnerVolumeSpecName "kube-api-access-7s8kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.472779 4923 scope.go:117] "RemoveContainer" containerID="d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.477239 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f219509-d66e-49c2-bc60-9de19d02d5f0" (UID: "1f219509-d66e-49c2-bc60-9de19d02d5f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.488483 4923 scope.go:117] "RemoveContainer" containerID="4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.505549 4923 scope.go:117] "RemoveContainer" containerID="ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596" Feb 24 02:57:26 crc kubenswrapper[4923]: E0224 02:57:26.505993 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596\": container with ID starting with ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596 not found: ID does not exist" containerID="ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.506029 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596"} err="failed to get container status \"ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596\": rpc error: code = NotFound desc = could not find container \"ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596\": container with ID starting with ff45e7f9d06aea47a855ff43c062c89f708c84ca7cebceb10c88ad9e7aef0596 not found: ID does not exist" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.506055 4923 scope.go:117] "RemoveContainer" containerID="d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3" Feb 24 02:57:26 crc kubenswrapper[4923]: E0224 02:57:26.506452 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3\": container with ID starting with 
d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3 not found: ID does not exist" containerID="d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.506473 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3"} err="failed to get container status \"d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3\": rpc error: code = NotFound desc = could not find container \"d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3\": container with ID starting with d1cb145ca9fdea29fb2a67fb6695bebc02b6c2ddc6ec1dc846fdc74c9eb377b3 not found: ID does not exist" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.506488 4923 scope.go:117] "RemoveContainer" containerID="4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6" Feb 24 02:57:26 crc kubenswrapper[4923]: E0224 02:57:26.506819 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6\": container with ID starting with 4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6 not found: ID does not exist" containerID="4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.506839 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6"} err="failed to get container status \"4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6\": rpc error: code = NotFound desc = could not find container \"4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6\": container with ID starting with 4aeb4d5acdc7a309fe055d202f4084184416851c29510033e84b1e5ef640d0b6 not found: ID does not 
exist" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.551013 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.551037 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s8kd\" (UniqueName: \"kubernetes.io/projected/1f219509-d66e-49c2-bc60-9de19d02d5f0-kube-api-access-7s8kd\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.551049 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f219509-d66e-49c2-bc60-9de19d02d5f0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.844396 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdqx"] Feb 24 02:57:26 crc kubenswrapper[4923]: I0224 02:57:26.849621 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4mdqx"] Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.465391 4923 generic.go:334] "Generic (PLEG): container finished" podID="844ee205-faee-4873-978e-cf3d64cd8397" containerID="5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e" exitCode=0 Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.465453 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpqqp" event={"ID":"844ee205-faee-4873-978e-cf3d64cd8397","Type":"ContainerDied","Data":"5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e"} Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.719364 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f219509-d66e-49c2-bc60-9de19d02d5f0" path="/var/lib/kubelet/pods/1f219509-d66e-49c2-bc60-9de19d02d5f0/volumes" Feb 24 
02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.740590 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 02:57:27 crc kubenswrapper[4923]: E0224 02:57:27.741011 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerName="extract-content" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.741030 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerName="extract-content" Feb 24 02:57:27 crc kubenswrapper[4923]: E0224 02:57:27.741050 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerName="registry-server" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.741059 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerName="registry-server" Feb 24 02:57:27 crc kubenswrapper[4923]: E0224 02:57:27.741082 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerName="extract-utilities" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.741090 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerName="extract-utilities" Feb 24 02:57:27 crc kubenswrapper[4923]: E0224 02:57:27.741098 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27447569-e0b5-4db6-915e-fe063cf88e81" containerName="pruner" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.741105 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="27447569-e0b5-4db6-915e-fe063cf88e81" containerName="pruner" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.741333 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="27447569-e0b5-4db6-915e-fe063cf88e81" containerName="pruner" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.741350 4923 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1f219509-d66e-49c2-bc60-9de19d02d5f0" containerName="registry-server" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.741844 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.745212 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.746036 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.759173 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.766114 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-var-lock\") pod \"installer-9-crc\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.766192 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-kubelet-dir\") pod \"installer-9-crc\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.766226 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3de006-3a29-42c2-8640-d757aecec059-kube-api-access\") pod \"installer-9-crc\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.867084 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3de006-3a29-42c2-8640-d757aecec059-kube-api-access\") pod \"installer-9-crc\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.867144 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-var-lock\") pod \"installer-9-crc\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.867200 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-kubelet-dir\") pod \"installer-9-crc\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.867260 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-kubelet-dir\") pod \"installer-9-crc\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.867315 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-var-lock\") pod \"installer-9-crc\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:27 crc kubenswrapper[4923]: I0224 02:57:27.893257 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3de006-3a29-42c2-8640-d757aecec059-kube-api-access\") pod \"installer-9-crc\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:28 crc kubenswrapper[4923]: I0224 02:57:28.056954 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:57:28 crc kubenswrapper[4923]: I0224 02:57:28.472125 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpqqp" event={"ID":"844ee205-faee-4873-978e-cf3d64cd8397","Type":"ContainerStarted","Data":"62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3"} Feb 24 02:57:28 crc kubenswrapper[4923]: I0224 02:57:28.475424 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5lw8" event={"ID":"3c62fecb-b531-4754-a747-902b75b2350d","Type":"ContainerStarted","Data":"41abb1a86e2ea27a699e3de57c2edc784a3ecc77c3bd6cce545ba4c1ecc6230c"} Feb 24 02:57:28 crc kubenswrapper[4923]: I0224 02:57:28.498870 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qpqqp" podStartSLOduration=2.464409124 podStartE2EDuration="48.49885472s" podCreationTimestamp="2026-02-24 02:56:40 +0000 UTC" firstStartedPulling="2026-02-24 02:56:41.84350606 +0000 UTC m=+125.860576873" lastFinishedPulling="2026-02-24 02:57:27.877951656 +0000 UTC m=+171.895022469" observedRunningTime="2026-02-24 02:57:28.494340922 +0000 UTC m=+172.511411735" watchObservedRunningTime="2026-02-24 02:57:28.49885472 +0000 UTC m=+172.515925533" Feb 24 02:57:28 crc kubenswrapper[4923]: I0224 02:57:28.519150 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k5lw8" podStartSLOduration=4.16819722 podStartE2EDuration="47.519131219s" 
podCreationTimestamp="2026-02-24 02:56:41 +0000 UTC" firstStartedPulling="2026-02-24 02:56:44.047003861 +0000 UTC m=+128.064074674" lastFinishedPulling="2026-02-24 02:57:27.39793785 +0000 UTC m=+171.415008673" observedRunningTime="2026-02-24 02:57:28.517948392 +0000 UTC m=+172.535019205" watchObservedRunningTime="2026-02-24 02:57:28.519131219 +0000 UTC m=+172.536202032" Feb 24 02:57:28 crc kubenswrapper[4923]: I0224 02:57:28.556482 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 02:57:29 crc kubenswrapper[4923]: I0224 02:57:29.482788 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af3de006-3a29-42c2-8640-d757aecec059","Type":"ContainerStarted","Data":"ed03d7e4aa9d410a536dc09e7d3f6edf2024b923eb8cddc465377148c10cabeb"} Feb 24 02:57:29 crc kubenswrapper[4923]: I0224 02:57:29.483146 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af3de006-3a29-42c2-8640-d757aecec059","Type":"ContainerStarted","Data":"9f15de5a0119071e97ebcc8c5e0fcfc5d70a943751643c1b91863b34a7740e10"} Feb 24 02:57:30 crc kubenswrapper[4923]: I0224 02:57:30.434615 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:57:30 crc kubenswrapper[4923]: I0224 02:57:30.434685 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:57:30 crc kubenswrapper[4923]: I0224 02:57:30.442168 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:57:30 crc kubenswrapper[4923]: I0224 02:57:30.461672 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.461648868 podStartE2EDuration="3.461648868s" 
podCreationTimestamp="2026-02-24 02:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:57:29.507716452 +0000 UTC m=+173.524787275" watchObservedRunningTime="2026-02-24 02:57:30.461648868 +0000 UTC m=+174.478719681" Feb 24 02:57:30 crc kubenswrapper[4923]: I0224 02:57:30.478193 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:57:30 crc kubenswrapper[4923]: I0224 02:57:30.484056 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 02:57:32 crc kubenswrapper[4923]: I0224 02:57:32.226763 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:57:32 crc kubenswrapper[4923]: I0224 02:57:32.227031 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:57:32 crc kubenswrapper[4923]: I0224 02:57:32.266427 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:57:32 crc kubenswrapper[4923]: I0224 02:57:32.560942 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 02:57:33 crc kubenswrapper[4923]: I0224 02:57:33.662691 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:57:33 crc kubenswrapper[4923]: I0224 02:57:33.707082 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 02:57:34 crc kubenswrapper[4923]: I0224 02:57:34.081611 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:57:34 crc kubenswrapper[4923]: I0224 02:57:34.139068 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:57:36 crc kubenswrapper[4923]: I0224 02:57:36.088052 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sh7x6"] Feb 24 02:57:36 crc kubenswrapper[4923]: I0224 02:57:36.088594 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sh7x6" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerName="registry-server" containerID="cri-o://c7b1c39c136dd90f95c5c54fce670e79d9bbec840ad54cb456867e5ec003e16c" gracePeriod=2 Feb 24 02:57:37 crc kubenswrapper[4923]: I0224 02:57:37.440126 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7548dc4f95-bb92n"] Feb 24 02:57:37 crc kubenswrapper[4923]: I0224 02:57:37.440486 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" podUID="120499d4-e031-4ef7-81c9-099cd0d710e7" containerName="controller-manager" containerID="cri-o://d9c3e65d0b4bbda1cc260458c07a5bbc8802cbd43c285b5e8040d1bbe208867c" gracePeriod=30 Feb 24 02:57:37 crc kubenswrapper[4923]: I0224 02:57:37.526114 4923 generic.go:334] "Generic (PLEG): container finished" podID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerID="c7b1c39c136dd90f95c5c54fce670e79d9bbec840ad54cb456867e5ec003e16c" exitCode=0 Feb 24 02:57:37 crc kubenswrapper[4923]: I0224 02:57:37.526159 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh7x6" event={"ID":"57508484-52d2-4deb-932e-ecfdf603f0f2","Type":"ContainerDied","Data":"c7b1c39c136dd90f95c5c54fce670e79d9bbec840ad54cb456867e5ec003e16c"} Feb 24 02:57:37 crc kubenswrapper[4923]: I0224 02:57:37.540857 
4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc"] Feb 24 02:57:37 crc kubenswrapper[4923]: I0224 02:57:37.541111 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" podUID="d534f39e-ad8c-4216-b373-6b3041fe84d5" containerName="route-controller-manager" containerID="cri-o://ebf777fe73f40c53eb29c662ba14224e50d11a153fac9c1e1b087a34b6fa83dd" gracePeriod=30 Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.536153 4923 generic.go:334] "Generic (PLEG): container finished" podID="d534f39e-ad8c-4216-b373-6b3041fe84d5" containerID="ebf777fe73f40c53eb29c662ba14224e50d11a153fac9c1e1b087a34b6fa83dd" exitCode=0 Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.536221 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" event={"ID":"d534f39e-ad8c-4216-b373-6b3041fe84d5","Type":"ContainerDied","Data":"ebf777fe73f40c53eb29c662ba14224e50d11a153fac9c1e1b087a34b6fa83dd"} Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.538292 4923 generic.go:334] "Generic (PLEG): container finished" podID="120499d4-e031-4ef7-81c9-099cd0d710e7" containerID="d9c3e65d0b4bbda1cc260458c07a5bbc8802cbd43c285b5e8040d1bbe208867c" exitCode=0 Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.538324 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" event={"ID":"120499d4-e031-4ef7-81c9-099cd0d710e7","Type":"ContainerDied","Data":"d9c3e65d0b4bbda1cc260458c07a5bbc8802cbd43c285b5e8040d1bbe208867c"} Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.629956 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.718187 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-catalog-content\") pod \"57508484-52d2-4deb-932e-ecfdf603f0f2\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.718244 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-utilities\") pod \"57508484-52d2-4deb-932e-ecfdf603f0f2\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.718350 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hkf7\" (UniqueName: \"kubernetes.io/projected/57508484-52d2-4deb-932e-ecfdf603f0f2-kube-api-access-4hkf7\") pod \"57508484-52d2-4deb-932e-ecfdf603f0f2\" (UID: \"57508484-52d2-4deb-932e-ecfdf603f0f2\") " Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.719774 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-utilities" (OuterVolumeSpecName: "utilities") pod "57508484-52d2-4deb-932e-ecfdf603f0f2" (UID: "57508484-52d2-4deb-932e-ecfdf603f0f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.724753 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57508484-52d2-4deb-932e-ecfdf603f0f2-kube-api-access-4hkf7" (OuterVolumeSpecName: "kube-api-access-4hkf7") pod "57508484-52d2-4deb-932e-ecfdf603f0f2" (UID: "57508484-52d2-4deb-932e-ecfdf603f0f2"). InnerVolumeSpecName "kube-api-access-4hkf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.819541 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.819581 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hkf7\" (UniqueName: \"kubernetes.io/projected/57508484-52d2-4deb-932e-ecfdf603f0f2-kube-api-access-4hkf7\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.849509 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57508484-52d2-4deb-932e-ecfdf603f0f2" (UID: "57508484-52d2-4deb-932e-ecfdf603f0f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:57:38 crc kubenswrapper[4923]: I0224 02:57:38.920269 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57508484-52d2-4deb-932e-ecfdf603f0f2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.231520 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.248854 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.322976 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-client-ca\") pod \"d534f39e-ad8c-4216-b373-6b3041fe84d5\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323332 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-config\") pod \"120499d4-e031-4ef7-81c9-099cd0d710e7\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323353 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-proxy-ca-bundles\") pod \"120499d4-e031-4ef7-81c9-099cd0d710e7\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323376 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hzml\" (UniqueName: \"kubernetes.io/projected/120499d4-e031-4ef7-81c9-099cd0d710e7-kube-api-access-6hzml\") pod \"120499d4-e031-4ef7-81c9-099cd0d710e7\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323394 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d534f39e-ad8c-4216-b373-6b3041fe84d5-serving-cert\") pod \"d534f39e-ad8c-4216-b373-6b3041fe84d5\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323409 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-tthfp\" (UniqueName: \"kubernetes.io/projected/d534f39e-ad8c-4216-b373-6b3041fe84d5-kube-api-access-tthfp\") pod \"d534f39e-ad8c-4216-b373-6b3041fe84d5\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323444 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/120499d4-e031-4ef7-81c9-099cd0d710e7-serving-cert\") pod \"120499d4-e031-4ef7-81c9-099cd0d710e7\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323461 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-config\") pod \"d534f39e-ad8c-4216-b373-6b3041fe84d5\" (UID: \"d534f39e-ad8c-4216-b373-6b3041fe84d5\") " Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323490 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-client-ca\") pod \"120499d4-e031-4ef7-81c9-099cd0d710e7\" (UID: \"120499d4-e031-4ef7-81c9-099cd0d710e7\") " Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323549 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "d534f39e-ad8c-4216-b373-6b3041fe84d5" (UID: "d534f39e-ad8c-4216-b373-6b3041fe84d5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.323686 4923 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.324160 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "120499d4-e031-4ef7-81c9-099cd0d710e7" (UID: "120499d4-e031-4ef7-81c9-099cd0d710e7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.324129 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-client-ca" (OuterVolumeSpecName: "client-ca") pod "120499d4-e031-4ef7-81c9-099cd0d710e7" (UID: "120499d4-e031-4ef7-81c9-099cd0d710e7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.324640 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-config" (OuterVolumeSpecName: "config") pod "d534f39e-ad8c-4216-b373-6b3041fe84d5" (UID: "d534f39e-ad8c-4216-b373-6b3041fe84d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.324714 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-config" (OuterVolumeSpecName: "config") pod "120499d4-e031-4ef7-81c9-099cd0d710e7" (UID: "120499d4-e031-4ef7-81c9-099cd0d710e7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.326912 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d534f39e-ad8c-4216-b373-6b3041fe84d5-kube-api-access-tthfp" (OuterVolumeSpecName: "kube-api-access-tthfp") pod "d534f39e-ad8c-4216-b373-6b3041fe84d5" (UID: "d534f39e-ad8c-4216-b373-6b3041fe84d5"). InnerVolumeSpecName "kube-api-access-tthfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.327014 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d534f39e-ad8c-4216-b373-6b3041fe84d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d534f39e-ad8c-4216-b373-6b3041fe84d5" (UID: "d534f39e-ad8c-4216-b373-6b3041fe84d5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.327137 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/120499d4-e031-4ef7-81c9-099cd0d710e7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "120499d4-e031-4ef7-81c9-099cd0d710e7" (UID: "120499d4-e031-4ef7-81c9-099cd0d710e7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.327821 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/120499d4-e031-4ef7-81c9-099cd0d710e7-kube-api-access-6hzml" (OuterVolumeSpecName: "kube-api-access-6hzml") pod "120499d4-e031-4ef7-81c9-099cd0d710e7" (UID: "120499d4-e031-4ef7-81c9-099cd0d710e7"). InnerVolumeSpecName "kube-api-access-6hzml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.425063 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.425118 4923 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.425140 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hzml\" (UniqueName: \"kubernetes.io/projected/120499d4-e031-4ef7-81c9-099cd0d710e7-kube-api-access-6hzml\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.425157 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tthfp\" (UniqueName: \"kubernetes.io/projected/d534f39e-ad8c-4216-b373-6b3041fe84d5-kube-api-access-tthfp\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.425174 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d534f39e-ad8c-4216-b373-6b3041fe84d5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.425190 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/120499d4-e031-4ef7-81c9-099cd0d710e7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.425208 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d534f39e-ad8c-4216-b373-6b3041fe84d5-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.425225 4923 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/120499d4-e031-4ef7-81c9-099cd0d710e7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.544586 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" event={"ID":"d534f39e-ad8c-4216-b373-6b3041fe84d5","Type":"ContainerDied","Data":"dd14193cc9307ded56ace76f4f97a4ce14bcd19f18e9ee7d8400db2b983611eb"} Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.544611 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.544667 4923 scope.go:117] "RemoveContainer" containerID="ebf777fe73f40c53eb29c662ba14224e50d11a153fac9c1e1b087a34b6fa83dd" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.547318 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" event={"ID":"120499d4-e031-4ef7-81c9-099cd0d710e7","Type":"ContainerDied","Data":"89970fe047f070593bd807b39bd63470e8a1c114943a2f496748a4a96529f408"} Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.547445 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7548dc4f95-bb92n" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.552984 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sh7x6" event={"ID":"57508484-52d2-4deb-932e-ecfdf603f0f2","Type":"ContainerDied","Data":"e014c31910bd52420e98cd2b965bba4569db2ee6d7ddd88609f754c629fc35d1"} Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.553081 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sh7x6" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.599762 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7548dc4f95-bb92n"] Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.614497 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7548dc4f95-bb92n"] Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.615091 4923 scope.go:117] "RemoveContainer" containerID="d9c3e65d0b4bbda1cc260458c07a5bbc8802cbd43c285b5e8040d1bbe208867c" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.625125 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sh7x6"] Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.629086 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sh7x6"] Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.635309 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc"] Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.639402 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-546bb7bf67-wwvzc"] Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.719270 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="120499d4-e031-4ef7-81c9-099cd0d710e7" path="/var/lib/kubelet/pods/120499d4-e031-4ef7-81c9-099cd0d710e7/volumes" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.719836 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" path="/var/lib/kubelet/pods/57508484-52d2-4deb-932e-ecfdf603f0f2/volumes" Feb 24 02:57:39 crc kubenswrapper[4923]: I0224 02:57:39.720779 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="d534f39e-ad8c-4216-b373-6b3041fe84d5" path="/var/lib/kubelet/pods/d534f39e-ad8c-4216-b373-6b3041fe84d5/volumes" Feb 24 02:57:40 crc kubenswrapper[4923]: I0224 02:57:40.268449 4923 scope.go:117] "RemoveContainer" containerID="c7b1c39c136dd90f95c5c54fce670e79d9bbec840ad54cb456867e5ec003e16c" Feb 24 02:57:40 crc kubenswrapper[4923]: I0224 02:57:40.290919 4923 scope.go:117] "RemoveContainer" containerID="0721cc11bbb1d07209505099f753e610bbd35eeb802b5f6ab6ebc26b82aa8f7c" Feb 24 02:57:40 crc kubenswrapper[4923]: I0224 02:57:40.368232 4923 scope.go:117] "RemoveContainer" containerID="d89bf519ffd666a73f8b19bc4599f54e96e36bb35495d84b175d5a6193209328" Feb 24 02:57:40 crc kubenswrapper[4923]: I0224 02:57:40.491096 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qpqqp" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.569512 4923 generic.go:334] "Generic (PLEG): container finished" podID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerID="df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5" exitCode=0 Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.569584 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bccxn" event={"ID":"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b","Type":"ContainerDied","Data":"df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5"} Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.571912 4923 generic.go:334] "Generic (PLEG): container finished" podID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerID="49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97" exitCode=0 Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.571957 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77dzs" 
event={"ID":"770b43e6-56e5-4d30-9d35-f3ce4dcf3563","Type":"ContainerDied","Data":"49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97"} Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.619698 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-577b9f9dc-z6zkf"] Feb 24 02:57:41 crc kubenswrapper[4923]: E0224 02:57:41.619939 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerName="extract-content" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.619953 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerName="extract-content" Feb 24 02:57:41 crc kubenswrapper[4923]: E0224 02:57:41.619971 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d534f39e-ad8c-4216-b373-6b3041fe84d5" containerName="route-controller-manager" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.619978 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d534f39e-ad8c-4216-b373-6b3041fe84d5" containerName="route-controller-manager" Feb 24 02:57:41 crc kubenswrapper[4923]: E0224 02:57:41.619985 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerName="registry-server" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.619992 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerName="registry-server" Feb 24 02:57:41 crc kubenswrapper[4923]: E0224 02:57:41.620006 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120499d4-e031-4ef7-81c9-099cd0d710e7" containerName="controller-manager" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.620013 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="120499d4-e031-4ef7-81c9-099cd0d710e7" containerName="controller-manager" Feb 24 02:57:41 crc kubenswrapper[4923]: E0224 
02:57:41.620020 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerName="extract-utilities" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.620026 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerName="extract-utilities" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.620122 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d534f39e-ad8c-4216-b373-6b3041fe84d5" containerName="route-controller-manager" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.620141 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="57508484-52d2-4deb-932e-ecfdf603f0f2" containerName="registry-server" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.620149 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="120499d4-e031-4ef7-81c9-099cd0d710e7" containerName="controller-manager" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.620549 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.623146 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.623828 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg"] Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.625247 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.625361 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.625648 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.625951 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.626159 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.626391 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.627731 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.629282 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.629524 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.629704 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.630501 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" 
Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.630728 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.634824 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg"] Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.634889 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-577b9f9dc-z6zkf"] Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.643830 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.652820 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-config\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.653002 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa65acaa-189b-437c-8263-9c30e2e992bf-serving-cert\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.653039 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede09459-cabd-40d6-ac3e-cf6048eec76d-serving-cert\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: 
\"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.653126 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-client-ca\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.653222 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-config\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.653289 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hct44\" (UniqueName: \"kubernetes.io/projected/ede09459-cabd-40d6-ac3e-cf6048eec76d-kube-api-access-hct44\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.653408 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-proxy-ca-bundles\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.653437 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqvrf\" (UniqueName: \"kubernetes.io/projected/aa65acaa-189b-437c-8263-9c30e2e992bf-kube-api-access-xqvrf\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.653518 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-client-ca\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.754123 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-config\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.754179 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hct44\" (UniqueName: \"kubernetes.io/projected/ede09459-cabd-40d6-ac3e-cf6048eec76d-kube-api-access-hct44\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.754234 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-proxy-ca-bundles\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: 
\"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.754251 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqvrf\" (UniqueName: \"kubernetes.io/projected/aa65acaa-189b-437c-8263-9c30e2e992bf-kube-api-access-xqvrf\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.754276 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-client-ca\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.754343 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-config\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.754364 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa65acaa-189b-437c-8263-9c30e2e992bf-serving-cert\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.754381 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ede09459-cabd-40d6-ac3e-cf6048eec76d-serving-cert\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.754402 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-client-ca\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.755575 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-proxy-ca-bundles\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.755930 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-client-ca\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.755966 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-config\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 
02:57:41.756227 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-client-ca\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.756606 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-config\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.761905 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede09459-cabd-40d6-ac3e-cf6048eec76d-serving-cert\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.773027 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa65acaa-189b-437c-8263-9c30e2e992bf-serving-cert\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.776029 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqvrf\" (UniqueName: \"kubernetes.io/projected/aa65acaa-189b-437c-8263-9c30e2e992bf-kube-api-access-xqvrf\") pod \"controller-manager-577b9f9dc-z6zkf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " 
pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.777167 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hct44\" (UniqueName: \"kubernetes.io/projected/ede09459-cabd-40d6-ac3e-cf6048eec76d-kube-api-access-hct44\") pod \"route-controller-manager-7c68697f9d-lm2fg\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.946117 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:41 crc kubenswrapper[4923]: I0224 02:57:41.952493 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.196866 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg"] Feb 24 02:57:42 crc kubenswrapper[4923]: W0224 02:57:42.205815 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podede09459_cabd_40d6_ac3e_cf6048eec76d.slice/crio-2a9cfc07d00c13e82fab8604c269144a3640948cb1d159ec8269fbe4f6826e1b WatchSource:0}: Error finding container 2a9cfc07d00c13e82fab8604c269144a3640948cb1d159ec8269fbe4f6826e1b: Status 404 returned error can't find the container with id 2a9cfc07d00c13e82fab8604c269144a3640948cb1d159ec8269fbe4f6826e1b Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.267614 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-577b9f9dc-z6zkf"] Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.579882 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bccxn" event={"ID":"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b","Type":"ContainerStarted","Data":"83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7"} Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.581514 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" event={"ID":"aa65acaa-189b-437c-8263-9c30e2e992bf","Type":"ContainerStarted","Data":"7f988689a6f0c1e01f3a2e699e6ad9327e7806aa388f5306171bc9a73fcbea12"} Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.581544 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" event={"ID":"aa65acaa-189b-437c-8263-9c30e2e992bf","Type":"ContainerStarted","Data":"5319657e04d0e12fb20827769c2fb2587b5d04c3f2260cf9c351029544954abb"} Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.581770 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.583923 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77dzs" event={"ID":"770b43e6-56e5-4d30-9d35-f3ce4dcf3563","Type":"ContainerStarted","Data":"f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99"} Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.585121 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" event={"ID":"ede09459-cabd-40d6-ac3e-cf6048eec76d","Type":"ContainerStarted","Data":"6a10a07168dc5d7279e2e2e12d4b95f23f07f9704d650a8d7dcef8a87b9bfea3"} Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.585153 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" 
event={"ID":"ede09459-cabd-40d6-ac3e-cf6048eec76d","Type":"ContainerStarted","Data":"2a9cfc07d00c13e82fab8604c269144a3640948cb1d159ec8269fbe4f6826e1b"} Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.585347 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.606730 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.611066 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bccxn" podStartSLOduration=2.435401529 podStartE2EDuration="1m2.611053291s" podCreationTimestamp="2026-02-24 02:56:40 +0000 UTC" firstStartedPulling="2026-02-24 02:56:41.841130217 +0000 UTC m=+125.858201030" lastFinishedPulling="2026-02-24 02:57:42.016781979 +0000 UTC m=+186.033852792" observedRunningTime="2026-02-24 02:57:42.60838264 +0000 UTC m=+186.625453453" watchObservedRunningTime="2026-02-24 02:57:42.611053291 +0000 UTC m=+186.628124104" Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.629323 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" podStartSLOduration=5.629308308 podStartE2EDuration="5.629308308s" podCreationTimestamp="2026-02-24 02:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:57:42.626053659 +0000 UTC m=+186.643124472" watchObservedRunningTime="2026-02-24 02:57:42.629308308 +0000 UTC m=+186.646379111" Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.654616 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-77dzs" 
podStartSLOduration=3.639896076 podStartE2EDuration="1m2.65459595s" podCreationTimestamp="2026-02-24 02:56:40 +0000 UTC" firstStartedPulling="2026-02-24 02:56:42.961624951 +0000 UTC m=+126.978695764" lastFinishedPulling="2026-02-24 02:57:41.976324825 +0000 UTC m=+185.993395638" observedRunningTime="2026-02-24 02:57:42.653020222 +0000 UTC m=+186.670091035" watchObservedRunningTime="2026-02-24 02:57:42.65459595 +0000 UTC m=+186.671666763" Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.682365 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" podStartSLOduration=5.682345156 podStartE2EDuration="5.682345156s" podCreationTimestamp="2026-02-24 02:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:57:42.680928603 +0000 UTC m=+186.697999426" watchObservedRunningTime="2026-02-24 02:57:42.682345156 +0000 UTC m=+186.699415969" Feb 24 02:57:42 crc kubenswrapper[4923]: I0224 02:57:42.769119 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:47 crc kubenswrapper[4923]: I0224 02:57:47.856009 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 02:57:48 crc kubenswrapper[4923]: I0224 02:57:48.369506 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfds7"] Feb 24 02:57:50 crc kubenswrapper[4923]: I0224 02:57:50.640238 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:57:50 crc kubenswrapper[4923]: I0224 02:57:50.640586 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:57:50 crc kubenswrapper[4923]: I0224 02:57:50.693167 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:57:50 crc kubenswrapper[4923]: I0224 02:57:50.875262 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:57:50 crc kubenswrapper[4923]: I0224 02:57:50.875351 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:57:50 crc kubenswrapper[4923]: I0224 02:57:50.917581 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:57:51 crc kubenswrapper[4923]: I0224 02:57:51.665683 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:57:51 crc kubenswrapper[4923]: I0224 02:57:51.672638 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:57:52 crc kubenswrapper[4923]: I0224 02:57:52.135346 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bccxn"] Feb 24 02:57:53 crc kubenswrapper[4923]: I0224 02:57:53.643065 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bccxn" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerName="registry-server" containerID="cri-o://83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7" gracePeriod=2 Feb 24 02:57:53 crc kubenswrapper[4923]: I0224 02:57:53.931381 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-77dzs"] Feb 24 02:57:53 crc kubenswrapper[4923]: I0224 02:57:53.931892 4923 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-77dzs" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerName="registry-server" containerID="cri-o://f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99" gracePeriod=2 Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.141706 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.226111 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hk45\" (UniqueName: \"kubernetes.io/projected/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-kube-api-access-7hk45\") pod \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.226202 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-utilities\") pod \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.226226 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-catalog-content\") pod \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\" (UID: \"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b\") " Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.230780 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-utilities" (OuterVolumeSpecName: "utilities") pod "cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" (UID: "cba8789a-c6f5-4fb3-93a9-ec12a41dba0b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.233012 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-kube-api-access-7hk45" (OuterVolumeSpecName: "kube-api-access-7hk45") pod "cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" (UID: "cba8789a-c6f5-4fb3-93a9-ec12a41dba0b"). InnerVolumeSpecName "kube-api-access-7hk45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.277589 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" (UID: "cba8789a-c6f5-4fb3-93a9-ec12a41dba0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.327704 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hk45\" (UniqueName: \"kubernetes.io/projected/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-kube-api-access-7hk45\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.327739 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.327749 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.343107 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.428410 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvwx8\" (UniqueName: \"kubernetes.io/projected/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-kube-api-access-xvwx8\") pod \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.428479 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-catalog-content\") pod \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.428517 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-utilities\") pod \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\" (UID: \"770b43e6-56e5-4d30-9d35-f3ce4dcf3563\") " Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.429257 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-utilities" (OuterVolumeSpecName: "utilities") pod "770b43e6-56e5-4d30-9d35-f3ce4dcf3563" (UID: "770b43e6-56e5-4d30-9d35-f3ce4dcf3563"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.433237 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-kube-api-access-xvwx8" (OuterVolumeSpecName: "kube-api-access-xvwx8") pod "770b43e6-56e5-4d30-9d35-f3ce4dcf3563" (UID: "770b43e6-56e5-4d30-9d35-f3ce4dcf3563"). InnerVolumeSpecName "kube-api-access-xvwx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.474013 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "770b43e6-56e5-4d30-9d35-f3ce4dcf3563" (UID: "770b43e6-56e5-4d30-9d35-f3ce4dcf3563"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.529581 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.529616 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.529627 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvwx8\" (UniqueName: \"kubernetes.io/projected/770b43e6-56e5-4d30-9d35-f3ce4dcf3563-kube-api-access-xvwx8\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.649369 4923 generic.go:334] "Generic (PLEG): container finished" podID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerID="83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7" exitCode=0 Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.649421 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bccxn" event={"ID":"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b","Type":"ContainerDied","Data":"83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7"} Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.649449 4923 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bccxn" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.649517 4923 scope.go:117] "RemoveContainer" containerID="83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.649504 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bccxn" event={"ID":"cba8789a-c6f5-4fb3-93a9-ec12a41dba0b","Type":"ContainerDied","Data":"59288b830f49e99efa75f16f90816c7554fed38536a7238319b1b58b931c2a92"} Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.652954 4923 generic.go:334] "Generic (PLEG): container finished" podID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerID="f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99" exitCode=0 Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.653004 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77dzs" event={"ID":"770b43e6-56e5-4d30-9d35-f3ce4dcf3563","Type":"ContainerDied","Data":"f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99"} Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.653051 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-77dzs" event={"ID":"770b43e6-56e5-4d30-9d35-f3ce4dcf3563","Type":"ContainerDied","Data":"c7df0189f731f3347b2034f7069e3614242fe0279a7d0be393be207fd09691fd"} Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.653057 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-77dzs" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.672473 4923 scope.go:117] "RemoveContainer" containerID="df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.678483 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bccxn"] Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.681879 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bccxn"] Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.706835 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-77dzs"] Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.713910 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-77dzs"] Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.714030 4923 scope.go:117] "RemoveContainer" containerID="2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.727867 4923 scope.go:117] "RemoveContainer" containerID="83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7" Feb 24 02:57:54 crc kubenswrapper[4923]: E0224 02:57:54.728388 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7\": container with ID starting with 83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7 not found: ID does not exist" containerID="83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.728427 4923 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7"} err="failed to get container status \"83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7\": rpc error: code = NotFound desc = could not find container \"83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7\": container with ID starting with 83ccfe1eb31c75cb23741252b2399b39b0ccd886b7d8ce8f2577194a79337dc7 not found: ID does not exist" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.728470 4923 scope.go:117] "RemoveContainer" containerID="df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5" Feb 24 02:57:54 crc kubenswrapper[4923]: E0224 02:57:54.728976 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5\": container with ID starting with df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5 not found: ID does not exist" containerID="df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.729014 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5"} err="failed to get container status \"df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5\": rpc error: code = NotFound desc = could not find container \"df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5\": container with ID starting with df115f90cdcac64b02e871cc5141042b98e95ee51e4068cb8cb31798948eaea5 not found: ID does not exist" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.729041 4923 scope.go:117] "RemoveContainer" containerID="2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3" Feb 24 02:57:54 crc kubenswrapper[4923]: E0224 02:57:54.729389 4923 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3\": container with ID starting with 2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3 not found: ID does not exist" containerID="2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.729427 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3"} err="failed to get container status \"2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3\": rpc error: code = NotFound desc = could not find container \"2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3\": container with ID starting with 2f828a483ea84d110ea0a95f1b2840b4e1f302a4e888535ad6aa592574f317e3 not found: ID does not exist" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.729448 4923 scope.go:117] "RemoveContainer" containerID="f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.744377 4923 scope.go:117] "RemoveContainer" containerID="49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.759796 4923 scope.go:117] "RemoveContainer" containerID="51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.773344 4923 scope.go:117] "RemoveContainer" containerID="f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99" Feb 24 02:57:54 crc kubenswrapper[4923]: E0224 02:57:54.773708 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99\": container with ID starting with 
f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99 not found: ID does not exist" containerID="f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.773761 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99"} err="failed to get container status \"f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99\": rpc error: code = NotFound desc = could not find container \"f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99\": container with ID starting with f068db27eb3452166825ded300c80716f88666e27ea54007dc0800b98a9fab99 not found: ID does not exist" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.773793 4923 scope.go:117] "RemoveContainer" containerID="49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97" Feb 24 02:57:54 crc kubenswrapper[4923]: E0224 02:57:54.774188 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97\": container with ID starting with 49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97 not found: ID does not exist" containerID="49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.774216 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97"} err="failed to get container status \"49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97\": rpc error: code = NotFound desc = could not find container \"49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97\": container with ID starting with 49725ad3efb631b8f96c1dee9f8cc2a44071497679a694d7b3a171ec4e3d6c97 not found: ID does not 
exist" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.774235 4923 scope.go:117] "RemoveContainer" containerID="51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb" Feb 24 02:57:54 crc kubenswrapper[4923]: E0224 02:57:54.774515 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb\": container with ID starting with 51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb not found: ID does not exist" containerID="51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb" Feb 24 02:57:54 crc kubenswrapper[4923]: I0224 02:57:54.774542 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb"} err="failed to get container status \"51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb\": rpc error: code = NotFound desc = could not find container \"51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb\": container with ID starting with 51c1d73e8b49455503822c0095614f4124f12ff27fbc205ba8cd060815f02ccb not found: ID does not exist" Feb 24 02:57:55 crc kubenswrapper[4923]: I0224 02:57:55.725788 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" path="/var/lib/kubelet/pods/770b43e6-56e5-4d30-9d35-f3ce4dcf3563/volumes" Feb 24 02:57:55 crc kubenswrapper[4923]: I0224 02:57:55.727686 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" path="/var/lib/kubelet/pods/cba8789a-c6f5-4fb3-93a9-ec12a41dba0b/volumes" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.417998 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-577b9f9dc-z6zkf"] Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.418189 
4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" podUID="aa65acaa-189b-437c-8263-9c30e2e992bf" containerName="controller-manager" containerID="cri-o://7f988689a6f0c1e01f3a2e699e6ad9327e7806aa388f5306171bc9a73fcbea12" gracePeriod=30 Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.444205 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg"] Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.444459 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" podUID="ede09459-cabd-40d6-ac3e-cf6048eec76d" containerName="route-controller-manager" containerID="cri-o://6a10a07168dc5d7279e2e2e12d4b95f23f07f9704d650a8d7dcef8a87b9bfea3" gracePeriod=30 Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.673665 4923 generic.go:334] "Generic (PLEG): container finished" podID="aa65acaa-189b-437c-8263-9c30e2e992bf" containerID="7f988689a6f0c1e01f3a2e699e6ad9327e7806aa388f5306171bc9a73fcbea12" exitCode=0 Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.673794 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" event={"ID":"aa65acaa-189b-437c-8263-9c30e2e992bf","Type":"ContainerDied","Data":"7f988689a6f0c1e01f3a2e699e6ad9327e7806aa388f5306171bc9a73fcbea12"} Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.675367 4923 generic.go:334] "Generic (PLEG): container finished" podID="ede09459-cabd-40d6-ac3e-cf6048eec76d" containerID="6a10a07168dc5d7279e2e2e12d4b95f23f07f9704d650a8d7dcef8a87b9bfea3" exitCode=0 Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.675395 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" 
event={"ID":"ede09459-cabd-40d6-ac3e-cf6048eec76d","Type":"ContainerDied","Data":"6a10a07168dc5d7279e2e2e12d4b95f23f07f9704d650a8d7dcef8a87b9bfea3"} Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.887416 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.941706 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.973690 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-config\") pod \"aa65acaa-189b-437c-8263-9c30e2e992bf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.973746 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-client-ca\") pod \"aa65acaa-189b-437c-8263-9c30e2e992bf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.973770 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-client-ca\") pod \"ede09459-cabd-40d6-ac3e-cf6048eec76d\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.973796 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqvrf\" (UniqueName: \"kubernetes.io/projected/aa65acaa-189b-437c-8263-9c30e2e992bf-kube-api-access-xqvrf\") pod \"aa65acaa-189b-437c-8263-9c30e2e992bf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " Feb 24 
02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.973821 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede09459-cabd-40d6-ac3e-cf6048eec76d-serving-cert\") pod \"ede09459-cabd-40d6-ac3e-cf6048eec76d\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.973855 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-config\") pod \"ede09459-cabd-40d6-ac3e-cf6048eec76d\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.973908 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa65acaa-189b-437c-8263-9c30e2e992bf-serving-cert\") pod \"aa65acaa-189b-437c-8263-9c30e2e992bf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.973938 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-proxy-ca-bundles\") pod \"aa65acaa-189b-437c-8263-9c30e2e992bf\" (UID: \"aa65acaa-189b-437c-8263-9c30e2e992bf\") " Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.973989 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hct44\" (UniqueName: \"kubernetes.io/projected/ede09459-cabd-40d6-ac3e-cf6048eec76d-kube-api-access-hct44\") pod \"ede09459-cabd-40d6-ac3e-cf6048eec76d\" (UID: \"ede09459-cabd-40d6-ac3e-cf6048eec76d\") " Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.974659 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-client-ca" (OuterVolumeSpecName: "client-ca") 
pod "aa65acaa-189b-437c-8263-9c30e2e992bf" (UID: "aa65acaa-189b-437c-8263-9c30e2e992bf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.974830 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "aa65acaa-189b-437c-8263-9c30e2e992bf" (UID: "aa65acaa-189b-437c-8263-9c30e2e992bf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.975015 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-client-ca" (OuterVolumeSpecName: "client-ca") pod "ede09459-cabd-40d6-ac3e-cf6048eec76d" (UID: "ede09459-cabd-40d6-ac3e-cf6048eec76d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.975019 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-config" (OuterVolumeSpecName: "config") pod "ede09459-cabd-40d6-ac3e-cf6048eec76d" (UID: "ede09459-cabd-40d6-ac3e-cf6048eec76d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.975150 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-config" (OuterVolumeSpecName: "config") pod "aa65acaa-189b-437c-8263-9c30e2e992bf" (UID: "aa65acaa-189b-437c-8263-9c30e2e992bf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.978920 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa65acaa-189b-437c-8263-9c30e2e992bf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aa65acaa-189b-437c-8263-9c30e2e992bf" (UID: "aa65acaa-189b-437c-8263-9c30e2e992bf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.978983 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ede09459-cabd-40d6-ac3e-cf6048eec76d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ede09459-cabd-40d6-ac3e-cf6048eec76d" (UID: "ede09459-cabd-40d6-ac3e-cf6048eec76d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.979059 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa65acaa-189b-437c-8263-9c30e2e992bf-kube-api-access-xqvrf" (OuterVolumeSpecName: "kube-api-access-xqvrf") pod "aa65acaa-189b-437c-8263-9c30e2e992bf" (UID: "aa65acaa-189b-437c-8263-9c30e2e992bf"). InnerVolumeSpecName "kube-api-access-xqvrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:57 crc kubenswrapper[4923]: I0224 02:57:57.979116 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ede09459-cabd-40d6-ac3e-cf6048eec76d-kube-api-access-hct44" (OuterVolumeSpecName: "kube-api-access-hct44") pod "ede09459-cabd-40d6-ac3e-cf6048eec76d" (UID: "ede09459-cabd-40d6-ac3e-cf6048eec76d"). InnerVolumeSpecName "kube-api-access-hct44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.075528 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hct44\" (UniqueName: \"kubernetes.io/projected/ede09459-cabd-40d6-ac3e-cf6048eec76d-kube-api-access-hct44\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.075570 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.075588 4923 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.075606 4923 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.075623 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqvrf\" (UniqueName: \"kubernetes.io/projected/aa65acaa-189b-437c-8263-9c30e2e992bf-kube-api-access-xqvrf\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.075640 4923 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ede09459-cabd-40d6-ac3e-cf6048eec76d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.075656 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede09459-cabd-40d6-ac3e-cf6048eec76d-config\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.075672 4923 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa65acaa-189b-437c-8263-9c30e2e992bf-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.075727 4923 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa65acaa-189b-437c-8263-9c30e2e992bf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.616868 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c445884d9-58bpg"] Feb 24 02:57:58 crc kubenswrapper[4923]: E0224 02:57:58.617153 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerName="extract-content" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617169 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerName="extract-content" Feb 24 02:57:58 crc kubenswrapper[4923]: E0224 02:57:58.617182 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerName="registry-server" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617190 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerName="registry-server" Feb 24 02:57:58 crc kubenswrapper[4923]: E0224 02:57:58.617205 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerName="extract-utilities" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617212 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerName="extract-utilities" Feb 24 02:57:58 crc kubenswrapper[4923]: E0224 02:57:58.617225 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" 
containerName="extract-utilities" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617231 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerName="extract-utilities" Feb 24 02:57:58 crc kubenswrapper[4923]: E0224 02:57:58.617240 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerName="extract-content" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617247 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerName="extract-content" Feb 24 02:57:58 crc kubenswrapper[4923]: E0224 02:57:58.617255 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede09459-cabd-40d6-ac3e-cf6048eec76d" containerName="route-controller-manager" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617264 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede09459-cabd-40d6-ac3e-cf6048eec76d" containerName="route-controller-manager" Feb 24 02:57:58 crc kubenswrapper[4923]: E0224 02:57:58.617273 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa65acaa-189b-437c-8263-9c30e2e992bf" containerName="controller-manager" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617279 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa65acaa-189b-437c-8263-9c30e2e992bf" containerName="controller-manager" Feb 24 02:57:58 crc kubenswrapper[4923]: E0224 02:57:58.617287 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerName="registry-server" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617324 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerName="registry-server" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617423 4923 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ede09459-cabd-40d6-ac3e-cf6048eec76d" containerName="route-controller-manager" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617434 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba8789a-c6f5-4fb3-93a9-ec12a41dba0b" containerName="registry-server" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617444 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="770b43e6-56e5-4d30-9d35-f3ce4dcf3563" containerName="registry-server" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617455 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa65acaa-189b-437c-8263-9c30e2e992bf" containerName="controller-manager" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.617876 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.630801 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24"] Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.631756 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.638758 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c445884d9-58bpg"] Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.644664 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24"] Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.681135 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" event={"ID":"aa65acaa-189b-437c-8263-9c30e2e992bf","Type":"ContainerDied","Data":"5319657e04d0e12fb20827769c2fb2587b5d04c3f2260cf9c351029544954abb"} Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.681170 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-577b9f9dc-z6zkf" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.681203 4923 scope.go:117] "RemoveContainer" containerID="7f988689a6f0c1e01f3a2e699e6ad9327e7806aa388f5306171bc9a73fcbea12" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682161 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-serving-cert\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682192 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe37c243-71e0-4a04-8972-146cebe7b10d-config\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: 
\"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682210 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcx7p\" (UniqueName: \"kubernetes.io/projected/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-kube-api-access-mcx7p\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682235 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-config\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682252 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-proxy-ca-bundles\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682346 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m7qr\" (UniqueName: \"kubernetes.io/projected/fe37c243-71e0-4a04-8972-146cebe7b10d-kube-api-access-4m7qr\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682399 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-client-ca\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682509 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" event={"ID":"ede09459-cabd-40d6-ac3e-cf6048eec76d","Type":"ContainerDied","Data":"2a9cfc07d00c13e82fab8604c269144a3640948cb1d159ec8269fbe4f6826e1b"} Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682531 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe37c243-71e0-4a04-8972-146cebe7b10d-client-ca\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682567 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe37c243-71e0-4a04-8972-146cebe7b10d-serving-cert\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.682582 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.700858 4923 scope.go:117] "RemoveContainer" containerID="6a10a07168dc5d7279e2e2e12d4b95f23f07f9704d650a8d7dcef8a87b9bfea3" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.716693 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-577b9f9dc-z6zkf"] Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.720471 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-577b9f9dc-z6zkf"] Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.727041 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg"] Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.729792 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c68697f9d-lm2fg"] Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.783340 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-config\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.783381 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-proxy-ca-bundles\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.783404 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4m7qr\" (UniqueName: \"kubernetes.io/projected/fe37c243-71e0-4a04-8972-146cebe7b10d-kube-api-access-4m7qr\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.783422 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-client-ca\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.783449 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe37c243-71e0-4a04-8972-146cebe7b10d-client-ca\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.783466 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe37c243-71e0-4a04-8972-146cebe7b10d-serving-cert\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.783530 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-serving-cert\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " 
pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.783546 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe37c243-71e0-4a04-8972-146cebe7b10d-config\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.783563 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcx7p\" (UniqueName: \"kubernetes.io/projected/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-kube-api-access-mcx7p\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.785004 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-config\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.785083 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-client-ca\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.785110 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe37c243-71e0-4a04-8972-146cebe7b10d-config\") pod 
\"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.785165 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-proxy-ca-bundles\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.788473 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe37c243-71e0-4a04-8972-146cebe7b10d-client-ca\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.791880 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe37c243-71e0-4a04-8972-146cebe7b10d-serving-cert\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.791958 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-serving-cert\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.801067 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mcx7p\" (UniqueName: \"kubernetes.io/projected/3ac80a55-e8c8-43c3-a8be-be0cc5165a76-kube-api-access-mcx7p\") pod \"controller-manager-5c445884d9-58bpg\" (UID: \"3ac80a55-e8c8-43c3-a8be-be0cc5165a76\") " pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.806314 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m7qr\" (UniqueName: \"kubernetes.io/projected/fe37c243-71e0-4a04-8972-146cebe7b10d-kube-api-access-4m7qr\") pod \"route-controller-manager-5bfb979cff-kcx24\" (UID: \"fe37c243-71e0-4a04-8972-146cebe7b10d\") " pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.945287 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:58 crc kubenswrapper[4923]: I0224 02:57:58.959769 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.212946 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c445884d9-58bpg"] Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.390345 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24"] Feb 24 02:57:59 crc kubenswrapper[4923]: W0224 02:57:59.400875 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe37c243_71e0_4a04_8972_146cebe7b10d.slice/crio-6e39c2633cf6159f3c8e940d3e95f209424a935bfca385b921f6cff83bfe17d6 WatchSource:0}: Error finding container 6e39c2633cf6159f3c8e940d3e95f209424a935bfca385b921f6cff83bfe17d6: Status 404 returned error can't find the container with id 6e39c2633cf6159f3c8e940d3e95f209424a935bfca385b921f6cff83bfe17d6 Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.688757 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" event={"ID":"3ac80a55-e8c8-43c3-a8be-be0cc5165a76","Type":"ContainerStarted","Data":"3cfcc391c6ed3184cd47d9423d02c5d38482f1ee5003bcaed6f555258033bca6"} Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.689134 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.689147 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" event={"ID":"3ac80a55-e8c8-43c3-a8be-be0cc5165a76","Type":"ContainerStarted","Data":"8f2b6ff46e4b277deea900e85fab8044e03c5eadf16b57440ba6f7b380508707"} Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.691081 4923 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" event={"ID":"fe37c243-71e0-4a04-8972-146cebe7b10d","Type":"ContainerStarted","Data":"a17152447a61c736391f78e506209a4f9c2e4f62f7dbb6612289883446485733"} Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.691112 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" event={"ID":"fe37c243-71e0-4a04-8972-146cebe7b10d","Type":"ContainerStarted","Data":"6e39c2633cf6159f3c8e940d3e95f209424a935bfca385b921f6cff83bfe17d6"} Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.691346 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.696277 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.704488 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c445884d9-58bpg" podStartSLOduration=2.704473231 podStartE2EDuration="2.704473231s" podCreationTimestamp="2026-02-24 02:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:57:59.703937068 +0000 UTC m=+203.721007881" watchObservedRunningTime="2026-02-24 02:57:59.704473231 +0000 UTC m=+203.721544044" Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.719037 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa65acaa-189b-437c-8263-9c30e2e992bf" path="/var/lib/kubelet/pods/aa65acaa-189b-437c-8263-9c30e2e992bf/volumes" Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.719807 4923 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="ede09459-cabd-40d6-ac3e-cf6048eec76d" path="/var/lib/kubelet/pods/ede09459-cabd-40d6-ac3e-cf6048eec76d/volumes" Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.727186 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" podStartSLOduration=2.72717032 podStartE2EDuration="2.72717032s" podCreationTimestamp="2026-02-24 02:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:57:59.721790787 +0000 UTC m=+203.738861600" watchObservedRunningTime="2026-02-24 02:57:59.72717032 +0000 UTC m=+203.744241133" Feb 24 02:57:59 crc kubenswrapper[4923]: I0224 02:57:59.865830 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5bfb979cff-kcx24" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.424230 4923 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.425649 4923 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.425861 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.426074 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5c45fb0cfe819bb8381e78f840d5fc12778d94056d6ef3440e24bd744b82534d" gracePeriod=15 Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.426143 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://289399f55c2f1b1d895f64d36a00d6664b31e5a87c5238eec012b73140d1c6e6" gracePeriod=15 Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.426161 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b743a6362f655b21457e1b2fb383dd1ef637e82c5ec68b7c2768e5d371d2be7f" gracePeriod=15 Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.426171 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f3c44d8d664ba6ed57f842cc222128fb3ff294794a90e1162dbdf3395fa2b27c" gracePeriod=15 Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.426136 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://3e85b4aad496c83fd9337a52bfc8c683c38cffd96f8a9bd033dde2dfcb16c2ae" gracePeriod=15 Feb 24 02:58:06 crc 
kubenswrapper[4923]: I0224 02:58:06.426999 4923 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.427209 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.427225 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.427239 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.427247 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.427259 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428477 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.428495 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428502 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.428514 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 24 02:58:06 crc 
kubenswrapper[4923]: I0224 02:58:06.428520 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.428529 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428535 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.428546 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428552 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.428559 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428565 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.428574 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428579 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428695 4923 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428705 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428714 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428721 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428729 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428738 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428747 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428755 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428764 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.428880 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.428889 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.469863 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.485585 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.485654 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.485680 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.485713 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.485734 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.485756 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.485802 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.485833 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587554 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587620 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587647 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587667 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587701 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587759 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587754 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587832 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587866 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587870 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587943 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 
02:58:06.587961 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.587977 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.588020 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.588054 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.588079 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.730974 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.732481 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.733013 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b743a6362f655b21457e1b2fb383dd1ef637e82c5ec68b7c2768e5d371d2be7f" exitCode=0 Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.733039 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e85b4aad496c83fd9337a52bfc8c683c38cffd96f8a9bd033dde2dfcb16c2ae" exitCode=0 Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.733046 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="289399f55c2f1b1d895f64d36a00d6664b31e5a87c5238eec012b73140d1c6e6" exitCode=0 Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.733056 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f3c44d8d664ba6ed57f842cc222128fb3ff294794a90e1162dbdf3395fa2b27c" exitCode=2 Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.733119 4923 scope.go:117] "RemoveContainer" containerID="ff398e67eccf34b9e0bc1d34f2285b4a275efa099b3a0891887c9ca8e979d39c" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.734875 4923 generic.go:334] "Generic (PLEG): container finished" podID="af3de006-3a29-42c2-8640-d757aecec059" containerID="ed03d7e4aa9d410a536dc09e7d3f6edf2024b923eb8cddc465377148c10cabeb" exitCode=0 Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.734914 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"af3de006-3a29-42c2-8640-d757aecec059","Type":"ContainerDied","Data":"ed03d7e4aa9d410a536dc09e7d3f6edf2024b923eb8cddc465377148c10cabeb"} Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.735616 4923 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.735945 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.736360 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:06 crc kubenswrapper[4923]: I0224 02:58:06.766611 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:58:06 crc kubenswrapper[4923]: E0224 02:58:06.783864 4923 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18970f60ca92dc5b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:58:06.783478875 +0000 UTC m=+210.800549708,LastTimestamp:2026-02-24 02:58:06.783478875 +0000 UTC m=+210.800549708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 02:58:07 crc kubenswrapper[4923]: I0224 02:58:07.728251 4923 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: I0224 02:58:07.729173 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: I0224 02:58:07.729559 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: I0224 02:58:07.742818 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"efb5273d05e9551b981e371ab6357ed84f96e7b7bc171f3b5b190b64dc508ade"} Feb 24 02:58:07 crc kubenswrapper[4923]: I0224 02:58:07.742879 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c220356c56f7623b85f767ae2e91fa2a50b398109ebd1ed50d5714b6166309ec"} Feb 24 02:58:07 crc kubenswrapper[4923]: I0224 02:58:07.743682 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: I0224 02:58:07.743985 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: I0224 02:58:07.746248 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 02:58:07 crc kubenswrapper[4923]: E0224 02:58:07.890141 4923 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: E0224 02:58:07.891158 4923 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: E0224 02:58:07.891682 4923 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: E0224 02:58:07.891897 4923 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: E0224 02:58:07.892101 4923 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:07 crc kubenswrapper[4923]: I0224 02:58:07.892126 4923 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 24 02:58:07 crc kubenswrapper[4923]: 
E0224 02:58:07.892326 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.066802 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.067471 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.068015 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: E0224 02:58:08.093627 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.107279 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3de006-3a29-42c2-8640-d757aecec059-kube-api-access\") pod \"af3de006-3a29-42c2-8640-d757aecec059\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " Feb 24 02:58:08 crc 
kubenswrapper[4923]: I0224 02:58:08.107337 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-kubelet-dir\") pod \"af3de006-3a29-42c2-8640-d757aecec059\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.107363 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-var-lock\") pod \"af3de006-3a29-42c2-8640-d757aecec059\" (UID: \"af3de006-3a29-42c2-8640-d757aecec059\") " Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.107540 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "af3de006-3a29-42c2-8640-d757aecec059" (UID: "af3de006-3a29-42c2-8640-d757aecec059"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.107667 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-var-lock" (OuterVolumeSpecName: "var-lock") pod "af3de006-3a29-42c2-8640-d757aecec059" (UID: "af3de006-3a29-42c2-8640-d757aecec059"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.111558 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3de006-3a29-42c2-8640-d757aecec059-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "af3de006-3a29-42c2-8640-d757aecec059" (UID: "af3de006-3a29-42c2-8640-d757aecec059"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:58:08 crc kubenswrapper[4923]: E0224 02:58:08.116281 4923 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18970f60ca92dc5b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:58:06.783478875 +0000 UTC m=+210.800549708,LastTimestamp:2026-02-24 02:58:06.783478875 +0000 UTC m=+210.800549708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.208963 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af3de006-3a29-42c2-8640-d757aecec059-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.209024 4923 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.209046 4923 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/af3de006-3a29-42c2-8640-d757aecec059-var-lock\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:08 crc kubenswrapper[4923]: E0224 02:58:08.494915 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms" Feb 24 02:58:08 crc kubenswrapper[4923]: E0224 02:58:08.594702 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:58:08Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:58:08Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:58:08Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T02:58:08Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: E0224 02:58:08.595470 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 
24 02:58:08 crc kubenswrapper[4923]: E0224 02:58:08.595736 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: E0224 02:58:08.595934 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: E0224 02:58:08.596106 4923 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: E0224 02:58:08.596123 4923 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.756125 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"af3de006-3a29-42c2-8640-d757aecec059","Type":"ContainerDied","Data":"9f15de5a0119071e97ebcc8c5e0fcfc5d70a943751643c1b91863b34a7740e10"} Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.756164 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.756172 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f15de5a0119071e97ebcc8c5e0fcfc5d70a943751643c1b91863b34a7740e10" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.760671 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.761800 4923 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5c45fb0cfe819bb8381e78f840d5fc12778d94056d6ef3440e24bd744b82534d" exitCode=0 Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.800055 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.800360 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.802329 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.802871 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.803271 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.803607 4923 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.804172 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.919195 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.919349 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.919367 
4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.919415 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.919458 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.919580 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.919836 4923 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.919887 4923 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:08 crc kubenswrapper[4923]: I0224 02:58:08.919921 4923 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:09 crc kubenswrapper[4923]: E0224 02:58:09.296604 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="1.6s" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.724874 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.770709 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.771898 4923 scope.go:117] "RemoveContainer" containerID="b743a6362f655b21457e1b2fb383dd1ef637e82c5ec68b7c2768e5d371d2be7f" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.771966 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.772606 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.772987 4923 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.774003 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.775105 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.775621 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.775847 4923 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.792880 4923 scope.go:117] "RemoveContainer" containerID="3e85b4aad496c83fd9337a52bfc8c683c38cffd96f8a9bd033dde2dfcb16c2ae" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.806552 4923 scope.go:117] "RemoveContainer" containerID="289399f55c2f1b1d895f64d36a00d6664b31e5a87c5238eec012b73140d1c6e6" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.824085 4923 scope.go:117] "RemoveContainer" containerID="f3c44d8d664ba6ed57f842cc222128fb3ff294794a90e1162dbdf3395fa2b27c" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.838375 4923 scope.go:117] "RemoveContainer" containerID="5c45fb0cfe819bb8381e78f840d5fc12778d94056d6ef3440e24bd744b82534d" Feb 24 02:58:09 crc kubenswrapper[4923]: I0224 02:58:09.858382 4923 scope.go:117] "RemoveContainer" containerID="fdcd9cb862270aaa40cacf54fe5ab0e4e7f234fd5de4b04b6c0395e393a4df1a" Feb 24 02:58:10 crc kubenswrapper[4923]: E0224 02:58:10.898449 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="3.2s" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.401515 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" 
podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" containerName="oauth-openshift" containerID="cri-o://a84cb03f106753b6f0af813959423454f8e1546e9d2293bb47485348ae4a345e" gracePeriod=15 Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.794213 4923 generic.go:334] "Generic (PLEG): container finished" podID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" containerID="a84cb03f106753b6f0af813959423454f8e1546e9d2293bb47485348ae4a345e" exitCode=0 Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.794246 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" event={"ID":"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29","Type":"ContainerDied","Data":"a84cb03f106753b6f0af813959423454f8e1546e9d2293bb47485348ae4a345e"} Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.868736 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.869345 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.869837 4923 status_manager.go:851] "Failed to get status for pod" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lfds7\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.870368 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987088 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-login\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987217 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-provider-selection\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987258 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-policies\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987333 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-service-ca\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987379 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-cliconfig\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987432 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-router-certs\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987505 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-trusted-ca-bundle\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987554 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-idp-0-file-data\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987594 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd6ph\" (UniqueName: \"kubernetes.io/projected/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-kube-api-access-wd6ph\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987628 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-session\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987726 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-ocp-branding-template\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987774 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-dir\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987806 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-serving-cert\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.987842 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-error\") pod \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\" (UID: \"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29\") " Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.991892 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-policies" (OuterVolumeSpecName: "audit-policies") pod 
"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.992174 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.992208 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.992186 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.993232 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). 
InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.993731 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.994512 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.995343 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.996229 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-kube-api-access-wd6ph" (OuterVolumeSpecName: "kube-api-access-wd6ph") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "kube-api-access-wd6ph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.997781 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:58:13 crc kubenswrapper[4923]: I0224 02:58:13.999060 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.006536 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.009053 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.014677 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" (UID: "fb43fea7-b7c2-4c3b-b0ea-9358e14cce29"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089413 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089486 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089516 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089544 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd6ph\" (UniqueName: \"kubernetes.io/projected/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-kube-api-access-wd6ph\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089569 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-session\") on node 
\"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089595 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089622 4923 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089651 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089674 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089692 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089711 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089739 4923 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089763 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.089786 4923 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 24 02:58:14 crc kubenswrapper[4923]: E0224 02:58:14.099511 4923 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="6.4s" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.802012 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" event={"ID":"fb43fea7-b7c2-4c3b-b0ea-9358e14cce29","Type":"ContainerDied","Data":"fa19150da061f2f9b1629fd59769ef9bd3e08c9b18a9bc2a445362335b970294"} Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.802389 4923 scope.go:117] "RemoveContainer" containerID="a84cb03f106753b6f0af813959423454f8e1546e9d2293bb47485348ae4a345e" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.802188 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.803243 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.803584 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.804661 4923 status_manager.go:851] "Failed to get status for pod" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lfds7\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.817551 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.818165 4923 status_manager.go:851] "Failed to get status for pod" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lfds7\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:14 crc kubenswrapper[4923]: I0224 02:58:14.818687 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:16 crc kubenswrapper[4923]: I0224 02:58:16.712702 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:16 crc kubenswrapper[4923]: I0224 02:58:16.714272 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:16 crc kubenswrapper[4923]: I0224 02:58:16.715053 4923 status_manager.go:851] "Failed to get status for pod" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lfds7\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:16 crc kubenswrapper[4923]: I0224 02:58:16.715436 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:16 crc 
kubenswrapper[4923]: I0224 02:58:16.730777 4923 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:16 crc kubenswrapper[4923]: I0224 02:58:16.730826 4923 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:16 crc kubenswrapper[4923]: E0224 02:58:16.731449 4923 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:16 crc kubenswrapper[4923]: I0224 02:58:16.732099 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:16 crc kubenswrapper[4923]: I0224 02:58:16.816432 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a760c800b65975015252f434e2b7319587615aa435619092376b3a5cba733725"} Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.718889 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.719785 4923 status_manager.go:851] "Failed to get status for pod" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lfds7\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.720078 4923 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.720498 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.823937 4923 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="84be03984d3053a09bad8b9e244444afb0c49dbe725ecc1d52550cc94fc980fb" exitCode=0 Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.823979 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"84be03984d3053a09bad8b9e244444afb0c49dbe725ecc1d52550cc94fc980fb"} Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.824198 4923 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.824227 4923 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.824526 4923 
status_manager.go:851] "Failed to get status for pod" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" pod="openshift-authentication/oauth-openshift-558db77b4-lfds7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-lfds7\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:17 crc kubenswrapper[4923]: E0224 02:58:17.824653 4923 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.824974 4923 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.825178 4923 status_manager.go:851] "Failed to get status for pod" podUID="af3de006-3a29-42c2-8640-d757aecec059" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:17 crc kubenswrapper[4923]: I0224 02:58:17.825453 4923 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Feb 24 02:58:18 crc kubenswrapper[4923]: E0224 02:58:18.116993 4923 event.go:368] "Unable to write event (may retry 
after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18970f60ca92dc5b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 02:58:06.783478875 +0000 UTC m=+210.800549708,LastTimestamp:2026-02-24 02:58:06.783478875 +0000 UTC m=+210.800549708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 02:58:18 crc kubenswrapper[4923]: I0224 02:58:18.840474 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 24 02:58:18 crc kubenswrapper[4923]: I0224 02:58:18.841894 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 24 02:58:18 crc kubenswrapper[4923]: I0224 02:58:18.841936 4923 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="df468f367fa1345546fbfe1edec875c48e3fa9868dbe95b756a3984c8f1ee18f" exitCode=1 Feb 24 02:58:18 crc kubenswrapper[4923]: I0224 02:58:18.841992 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"df468f367fa1345546fbfe1edec875c48e3fa9868dbe95b756a3984c8f1ee18f"} Feb 24 02:58:18 crc kubenswrapper[4923]: I0224 02:58:18.842462 4923 scope.go:117] "RemoveContainer" containerID="df468f367fa1345546fbfe1edec875c48e3fa9868dbe95b756a3984c8f1ee18f" Feb 24 02:58:18 crc kubenswrapper[4923]: I0224 02:58:18.848090 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"58ba88965af38db46847f2248fa557a4440c1e06d7d6f24c4cde33e3a632169e"} Feb 24 02:58:18 crc kubenswrapper[4923]: I0224 02:58:18.848137 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"93df9507ae94584767fb1ad9024a1f365544e21e68d2c8f989be2bb71982ebfb"} Feb 24 02:58:18 crc kubenswrapper[4923]: I0224 02:58:18.848158 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ca4b38cdfea6806b397e5e1fae7538717311c76c466fb302f0c47ef6fc6405cb"} Feb 24 02:58:18 crc kubenswrapper[4923]: I0224 02:58:18.848173 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fc914e933ed75760dae00b42f7d58e99ef6d3c4c5a3563d9d6930c8718a919f2"} Feb 24 02:58:19 crc kubenswrapper[4923]: I0224 02:58:19.856405 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bbe621518224ace18f82b72907e3037c5807d1d46e9dfd141cec944988c7a51e"} Feb 24 02:58:19 crc 
kubenswrapper[4923]: I0224 02:58:19.856729 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:19 crc kubenswrapper[4923]: I0224 02:58:19.856593 4923 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:19 crc kubenswrapper[4923]: I0224 02:58:19.856752 4923 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:19 crc kubenswrapper[4923]: I0224 02:58:19.859541 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Feb 24 02:58:19 crc kubenswrapper[4923]: I0224 02:58:19.861020 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 24 02:58:19 crc kubenswrapper[4923]: I0224 02:58:19.861066 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"db41bc41aba1fb5c89a4325b90d7a903a953bbc9fbef03dd4a9df9c4ccb211f1"} Feb 24 02:58:21 crc kubenswrapper[4923]: I0224 02:58:21.286475 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:58:21 crc kubenswrapper[4923]: I0224 02:58:21.732747 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:21 crc kubenswrapper[4923]: I0224 02:58:21.732799 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 
02:58:21 crc kubenswrapper[4923]: I0224 02:58:21.737433 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:22 crc kubenswrapper[4923]: I0224 02:58:22.294908 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:58:22 crc kubenswrapper[4923]: I0224 02:58:22.300710 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:58:24 crc kubenswrapper[4923]: I0224 02:58:24.866182 4923 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:24 crc kubenswrapper[4923]: I0224 02:58:24.894326 4923 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:24 crc kubenswrapper[4923]: I0224 02:58:24.894374 4923 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:24 crc kubenswrapper[4923]: I0224 02:58:24.902853 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 02:58:24 crc kubenswrapper[4923]: I0224 02:58:24.910830 4923 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b98845d7-cd4a-4851-882b-1dc9c3e65604" Feb 24 02:58:25 crc kubenswrapper[4923]: I0224 02:58:25.899802 4923 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:25 crc kubenswrapper[4923]: I0224 02:58:25.899847 4923 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b" Feb 24 02:58:27 crc kubenswrapper[4923]: I0224 02:58:27.732392 4923 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b98845d7-cd4a-4851-882b-1dc9c3e65604" Feb 24 02:58:30 crc kubenswrapper[4923]: I0224 02:58:30.974342 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 24 02:58:31 crc kubenswrapper[4923]: I0224 02:58:31.182812 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 24 02:58:31 crc kubenswrapper[4923]: I0224 02:58:31.291430 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 02:58:31 crc kubenswrapper[4923]: I0224 02:58:31.313703 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 24 02:58:31 crc kubenswrapper[4923]: I0224 02:58:31.772993 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 24 02:58:31 crc kubenswrapper[4923]: I0224 02:58:31.807969 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 24 02:58:31 crc kubenswrapper[4923]: I0224 02:58:31.835680 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 24 02:58:32 crc kubenswrapper[4923]: I0224 02:58:32.161848 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 24 02:58:32 crc kubenswrapper[4923]: I0224 02:58:32.591804 4923 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-operator"/"metrics-tls" Feb 24 02:58:32 crc kubenswrapper[4923]: I0224 02:58:32.801645 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 24 02:58:32 crc kubenswrapper[4923]: I0224 02:58:32.815978 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 24 02:58:32 crc kubenswrapper[4923]: I0224 02:58:32.855373 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 24 02:58:32 crc kubenswrapper[4923]: I0224 02:58:32.922083 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 24 02:58:33 crc kubenswrapper[4923]: I0224 02:58:33.232833 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 24 02:58:33 crc kubenswrapper[4923]: I0224 02:58:33.260690 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 24 02:58:33 crc kubenswrapper[4923]: I0224 02:58:33.307721 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 24 02:58:33 crc kubenswrapper[4923]: I0224 02:58:33.461969 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 24 02:58:33 crc kubenswrapper[4923]: I0224 02:58:33.637679 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 24 02:58:33 crc kubenswrapper[4923]: I0224 02:58:33.640902 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 24 02:58:33 crc kubenswrapper[4923]: 
I0224 02:58:33.682223 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 24 02:58:34 crc kubenswrapper[4923]: I0224 02:58:34.392826 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 24 02:58:34 crc kubenswrapper[4923]: I0224 02:58:34.450223 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 24 02:58:34 crc kubenswrapper[4923]: I0224 02:58:34.712101 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 24 02:58:34 crc kubenswrapper[4923]: I0224 02:58:34.929218 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 24 02:58:34 crc kubenswrapper[4923]: I0224 02:58:34.949871 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 24 02:58:35 crc kubenswrapper[4923]: I0224 02:58:35.410943 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 24 02:58:35 crc kubenswrapper[4923]: I0224 02:58:35.609846 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 24 02:58:36 crc kubenswrapper[4923]: I0224 02:58:36.366667 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 24 02:58:36 crc kubenswrapper[4923]: I0224 02:58:36.523270 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 02:58:36 crc kubenswrapper[4923]: I0224 02:58:36.774967 4923 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 24 02:58:37 crc kubenswrapper[4923]: I0224 02:58:37.202425 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 24 02:58:37 crc kubenswrapper[4923]: I0224 02:58:37.263453 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 24 02:58:37 crc kubenswrapper[4923]: I0224 02:58:37.268364 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 24 02:58:37 crc kubenswrapper[4923]: I0224 02:58:37.307133 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 24 02:58:37 crc kubenswrapper[4923]: I0224 02:58:37.604995 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 24 02:58:37 crc kubenswrapper[4923]: I0224 02:58:37.628992 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 24 02:58:37 crc kubenswrapper[4923]: I0224 02:58:37.993764 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 24 02:58:38 crc kubenswrapper[4923]: I0224 02:58:38.181933 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 24 02:58:38 crc kubenswrapper[4923]: I0224 02:58:38.360036 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 24 02:58:38 crc kubenswrapper[4923]: I0224 02:58:38.380747 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 24 
02:58:38 crc kubenswrapper[4923]: I0224 02:58:38.432913 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 24 02:58:38 crc kubenswrapper[4923]: I0224 02:58:38.568402 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 24 02:58:38 crc kubenswrapper[4923]: I0224 02:58:38.572847 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 24 02:58:39 crc kubenswrapper[4923]: I0224 02:58:39.057236 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 24 02:58:39 crc kubenswrapper[4923]: I0224 02:58:39.119580 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 24 02:58:39 crc kubenswrapper[4923]: I0224 02:58:39.232384 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 24 02:58:39 crc kubenswrapper[4923]: I0224 02:58:39.287859 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 24 02:58:39 crc kubenswrapper[4923]: I0224 02:58:39.310162 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 24 02:58:39 crc kubenswrapper[4923]: I0224 02:58:39.473847 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 24 02:58:39 crc kubenswrapper[4923]: I0224 02:58:39.525901 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 24 02:58:39 crc kubenswrapper[4923]: I0224 02:58:39.872656 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 24 02:58:39 crc kubenswrapper[4923]: I0224 02:58:39.913045 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.163728 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.220394 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.220726 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.302382 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.434343 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.511735 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.573516 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.646881 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.652467 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.663615 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.798629 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.918291 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.947836 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 24 02:58:40 crc kubenswrapper[4923]: I0224 02:58:40.998314 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.031260 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.036722 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.083359 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.210360 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.224068 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.355371 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.391334 4923 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.453437 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.503747 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.700199 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.977838 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 24 02:58:41 crc kubenswrapper[4923]: I0224 02:58:41.991699 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.028290 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.084120 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.089806 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.133342 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.139605 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.263356 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.291457 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.328796 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.375627 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.382836 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.461247 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.511058 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.539350 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.548170 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.576847 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.652181 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.700556 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 24 02:58:42 crc kubenswrapper[4923]: I0224 02:58:42.980907 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.041759 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.071708 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.122355 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.210374 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.308369 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.376467 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.382107 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.404748 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.442226 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.446065 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.474235 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.503093 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.566628 4923 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.697931 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.706890 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.721155 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.739785 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.740792 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.778164 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.816409 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.824466 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.828994 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.852813 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.887053 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.890170 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 24 02:58:43 crc kubenswrapper[4923]: I0224 02:58:43.971769 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.009356 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.015123 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.120424 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.148642 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.182186 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.210776 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.273763 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.353531 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.591693 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.693376 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.723267 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.725973 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.812487 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.864584 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 24 02:58:44 crc kubenswrapper[4923]: I0224 02:58:44.995693 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.065330 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.164974 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.168700 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.258572 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.258713 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.334792 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.340717 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.484829 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.502809 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.673712 4923 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.735845 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.743052 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.753159 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.764966 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.866008 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.903609 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 24 02:58:45 crc kubenswrapper[4923]: I0224 02:58:45.925434 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.086376 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.096892 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.113241 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.122444 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.185932 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.201775 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.354822 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.534004 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.582484 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.638187 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.826003 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.881587 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.918439 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 24 02:58:46 crc kubenswrapper[4923]: I0224 02:58:46.961764 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.107355 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.109197 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.219467 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.268683 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.369545 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.395659 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.412274 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.464631 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.503389 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.504577 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.557819 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.594892 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.680013 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.802717 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.832135 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.948591 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 24 02:58:47 crc kubenswrapper[4923]: I0224 02:58:47.998346 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.124151 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.206922 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.236804 4923 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.241422 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.241607 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.241588979 podStartE2EDuration="42.241588979s" podCreationTimestamp="2026-02-24 02:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:58:24.561928693 +0000 UTC m=+228.578999516" watchObservedRunningTime="2026-02-24 02:58:48.241588979 +0000 UTC m=+252.258659802"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.242370 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-lfds7"]
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.242423 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-649d76d5b4-5njkq"]
Feb 24 02:58:48 crc kubenswrapper[4923]: E0224 02:58:48.242667 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3de006-3a29-42c2-8640-d757aecec059" containerName="installer"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.242689 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3de006-3a29-42c2-8640-d757aecec059" containerName="installer"
Feb 24 02:58:48 crc kubenswrapper[4923]: E0224 02:58:48.242732 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" containerName="oauth-openshift"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.242743 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" containerName="oauth-openshift"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.242928 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" containerName="oauth-openshift"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.242947 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3de006-3a29-42c2-8640-d757aecec059" containerName="installer"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.243372 4923 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.243427 4923 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6d6cf57e-b3b6-4b6b-9568-634a762d8a8b"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.243696 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.248021 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.248139 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.248376 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.248520 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.248560 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.248773 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.248986 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.249225 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.249399 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.249560 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.249702 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.249731 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.250772 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.261360 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.262455 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.270139 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.284352 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.284289401 podStartE2EDuration="24.284289401s" podCreationTimestamp="2026-02-24 02:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:58:48.282332242 +0000 UTC m=+252.299403065" watchObservedRunningTime="2026-02-24 02:58:48.284289401 +0000 UTC m=+252.301360214"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298197 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-template-error\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298271 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ace1e79f-8e98-4a25-9cbb-1059d87029ff-audit-dir\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298354 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-session\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298420 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-serving-cert\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298456 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298486 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298528 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298553 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298592 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dd4d\" (UniqueName: \"kubernetes.io/projected/ace1e79f-8e98-4a25-9cbb-1059d87029ff-kube-api-access-4dd4d\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298626 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-template-login\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298661 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-audit-policies\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298694 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-service-ca\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298733 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-cliconfig\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.298778 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-router-certs\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.304808 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.370990 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399264 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-template-error\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") "
pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399331 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ace1e79f-8e98-4a25-9cbb-1059d87029ff-audit-dir\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399358 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-session\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399379 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-serving-cert\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399399 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399419 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399440 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399458 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399453 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ace1e79f-8e98-4a25-9cbb-1059d87029ff-audit-dir\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399474 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dd4d\" (UniqueName: \"kubernetes.io/projected/ace1e79f-8e98-4a25-9cbb-1059d87029ff-kube-api-access-4dd4d\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" 
Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399561 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-template-login\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399600 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-audit-policies\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399641 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-service-ca\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399690 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-cliconfig\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.399743 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-router-certs\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.400569 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-audit-policies\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.403373 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-cliconfig\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.403504 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-service-ca\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.403590 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc 
kubenswrapper[4923]: I0224 02:58:48.406899 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.415525 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-template-error\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.415524 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.415679 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-router-certs\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.417150 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.417199 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-user-template-login\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.417646 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-session\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.418647 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ace1e79f-8e98-4a25-9cbb-1059d87029ff-v4-0-config-system-serving-cert\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.421274 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dd4d\" (UniqueName: \"kubernetes.io/projected/ace1e79f-8e98-4a25-9cbb-1059d87029ff-kube-api-access-4dd4d\") pod \"oauth-openshift-649d76d5b4-5njkq\" (UID: \"ace1e79f-8e98-4a25-9cbb-1059d87029ff\") " pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc 
kubenswrapper[4923]: I0224 02:58:48.445049 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.470175 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.561164 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.643168 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.725147 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.730903 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.762952 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.772825 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.777106 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.899093 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 24 02:58:48 crc kubenswrapper[4923]: I0224 02:58:48.954113 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 24 02:58:48 crc 
kubenswrapper[4923]: I0224 02:58:48.958156 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-649d76d5b4-5njkq"] Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.027333 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" event={"ID":"ace1e79f-8e98-4a25-9cbb-1059d87029ff","Type":"ContainerStarted","Data":"ff65a5047e2bec5bddbdbd44ca7a545b827afef4ba7c9345fc8af783fdcefc03"} Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.031566 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.041637 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.222442 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.233096 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.236567 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.351113 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.364087 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.434067 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 24 
02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.505538 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.608171 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.708346 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.718577 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.720216 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb43fea7-b7c2-4c3b-b0ea-9358e14cce29" path="/var/lib/kubelet/pods/fb43fea7-b7c2-4c3b-b0ea-9358e14cce29/volumes" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.762215 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.845457 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.916736 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.916800 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 02:58:49 crc kubenswrapper[4923]: I0224 02:58:49.976634 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.001469 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.019982 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.033883 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" event={"ID":"ace1e79f-8e98-4a25-9cbb-1059d87029ff","Type":"ContainerStarted","Data":"a64928ea932735d4e5186020906549893503d3f4206f62fd06632e840c136da8"} Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.034114 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.040159 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.060173 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-649d76d5b4-5njkq" podStartSLOduration=62.06015495 podStartE2EDuration="1m2.06015495s" podCreationTimestamp="2026-02-24 02:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 02:58:50.05771004 +0000 UTC m=+254.074780863" watchObservedRunningTime="2026-02-24 02:58:50.06015495 +0000 UTC m=+254.077225763" Feb 24 02:58:50 crc 
kubenswrapper[4923]: I0224 02:58:50.099897 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.114053 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.220579 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.346540 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.445279 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.507267 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.738535 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 24 02:58:50 crc kubenswrapper[4923]: I0224 02:58:50.799804 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 24 02:58:51 crc kubenswrapper[4923]: I0224 02:58:51.689841 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 24 02:58:51 crc kubenswrapper[4923]: I0224 02:58:51.691053 4923 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 24 02:58:52 crc kubenswrapper[4923]: I0224 02:58:52.062395 4923 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 24 02:58:52 crc kubenswrapper[4923]: I0224 02:58:52.295976 4923 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 24 02:58:52 crc kubenswrapper[4923]: I0224 02:58:52.551523 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 24 02:58:52 crc kubenswrapper[4923]: I0224 02:58:52.702768 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 24 02:58:53 crc kubenswrapper[4923]: I0224 02:58:53.107523 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 24 02:58:53 crc kubenswrapper[4923]: I0224 02:58:53.175805 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 24 02:58:53 crc kubenswrapper[4923]: I0224 02:58:53.549438 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 24 02:58:53 crc kubenswrapper[4923]: I0224 02:58:53.813910 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 24 02:58:54 crc kubenswrapper[4923]: I0224 02:58:54.336519 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 24 02:58:54 crc kubenswrapper[4923]: I0224 02:58:54.950661 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 24 02:58:58 crc kubenswrapper[4923]: I0224 02:58:58.645033 4923 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 02:58:58 crc kubenswrapper[4923]: I0224 
02:58:58.645585 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://efb5273d05e9551b981e371ab6357ed84f96e7b7bc171f3b5b190b64dc508ade" gracePeriod=5 Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.105673 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.105919 4923 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="efb5273d05e9551b981e371ab6357ed84f96e7b7bc171f3b5b190b64dc508ade" exitCode=137 Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.221849 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.222450 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.404898 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.405000 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.405037 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.405059 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.405086 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.405127 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.405158 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.406356 4923 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.406397 4923 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.405162 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.405182 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.414390 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.507059 4923 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.507102 4923 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 24 02:59:04 crc kubenswrapper[4923]: I0224 02:59:04.507112 4923 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 24 02:59:05 crc kubenswrapper[4923]: I0224 02:59:05.115190 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 24 02:59:05 crc kubenswrapper[4923]: I0224 02:59:05.115265 4923 scope.go:117] "RemoveContainer" containerID="efb5273d05e9551b981e371ab6357ed84f96e7b7bc171f3b5b190b64dc508ade"
Feb 24 02:59:05 crc kubenswrapper[4923]: I0224 02:59:05.115452 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 24 02:59:05 crc kubenswrapper[4923]: I0224 02:59:05.720136 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 24 02:59:05 crc kubenswrapper[4923]: I0224 02:59:05.720449 4923 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Feb 24 02:59:05 crc kubenswrapper[4923]: I0224 02:59:05.730058 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 24 02:59:05 crc kubenswrapper[4923]: I0224 02:59:05.730118 4923 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c9ba6937-6bb9-4e4d-99d6-5da262107e7b"
Feb 24 02:59:05 crc kubenswrapper[4923]: I0224 02:59:05.733011 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 24 02:59:05 crc kubenswrapper[4923]: I0224 02:59:05.733031 4923 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c9ba6937-6bb9-4e4d-99d6-5da262107e7b"
Feb 24 02:59:12 crc kubenswrapper[4923]: I0224 02:59:12.154277 4923 generic.go:334] "Generic (PLEG): container finished" podID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerID="06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0" exitCode=0
Feb 24 02:59:12 crc kubenswrapper[4923]: I0224 02:59:12.154340 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" event={"ID":"b66c0222-b5e0-4d1e-841e-507c8e61e482","Type":"ContainerDied","Data":"06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0"}
Feb 24 02:59:12 crc kubenswrapper[4923]: I0224 02:59:12.156050 4923 scope.go:117] "RemoveContainer" containerID="06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0"
Feb 24 02:59:13 crc kubenswrapper[4923]: I0224 02:59:13.167944 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" event={"ID":"b66c0222-b5e0-4d1e-841e-507c8e61e482","Type":"ContainerStarted","Data":"25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e"}
Feb 24 02:59:13 crc kubenswrapper[4923]: I0224 02:59:13.168684 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-292s5"
Feb 24 02:59:13 crc kubenswrapper[4923]: I0224 02:59:13.173828 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-292s5"
Feb 24 02:59:19 crc kubenswrapper[4923]: I0224 02:59:19.916879 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 02:59:19 crc kubenswrapper[4923]: I0224 02:59:19.917278 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 02:59:37 crc kubenswrapper[4923]: I0224 02:59:37.248938 4923 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 24 02:59:49 crc kubenswrapper[4923]: I0224 02:59:49.916256 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 02:59:49 crc kubenswrapper[4923]: I0224 02:59:49.916896 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 02:59:49 crc kubenswrapper[4923]: I0224 02:59:49.916964 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 02:59:49 crc kubenswrapper[4923]: I0224 02:59:49.917709 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92a19cc205b64e61b6c65d1f93e5df48760062306a031253913e5f685cebe0c6"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 02:59:49 crc kubenswrapper[4923]: I0224 02:59:49.917793 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://92a19cc205b64e61b6c65d1f93e5df48760062306a031253913e5f685cebe0c6" gracePeriod=600
Feb 24 02:59:50 crc kubenswrapper[4923]: I0224 02:59:50.381158 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="92a19cc205b64e61b6c65d1f93e5df48760062306a031253913e5f685cebe0c6" exitCode=0
Feb 24 02:59:50 crc kubenswrapper[4923]: I0224 02:59:50.381260 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"92a19cc205b64e61b6c65d1f93e5df48760062306a031253913e5f685cebe0c6"}
Feb 24 02:59:50 crc kubenswrapper[4923]: I0224 02:59:50.381586 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"e2c4fc96d859c5f960586857b5a88ab66b5662816e6463364ed18c251990f0e2"}
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.187104 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"]
Feb 24 03:00:00 crc kubenswrapper[4923]: E0224 03:00:00.187950 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.187964 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.188063 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.188493 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.190728 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.193187 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.200946 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"]
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.294107 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-secret-volume\") pod \"collect-profiles-29531700-kgcnv\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.294330 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fnn6\" (UniqueName: \"kubernetes.io/projected/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-kube-api-access-8fnn6\") pod \"collect-profiles-29531700-kgcnv\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.294409 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-config-volume\") pod \"collect-profiles-29531700-kgcnv\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.396201 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-config-volume\") pod \"collect-profiles-29531700-kgcnv\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.396520 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-secret-volume\") pod \"collect-profiles-29531700-kgcnv\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.396639 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fnn6\" (UniqueName: \"kubernetes.io/projected/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-kube-api-access-8fnn6\") pod \"collect-profiles-29531700-kgcnv\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.398092 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-config-volume\") pod \"collect-profiles-29531700-kgcnv\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.402774 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-secret-volume\") pod \"collect-profiles-29531700-kgcnv\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.432117 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fnn6\" (UniqueName: \"kubernetes.io/projected/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-kube-api-access-8fnn6\") pod \"collect-profiles-29531700-kgcnv\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.515346 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:00 crc kubenswrapper[4923]: I0224 03:00:00.713032 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"]
Feb 24 03:00:01 crc kubenswrapper[4923]: I0224 03:00:01.452889 4923 generic.go:334] "Generic (PLEG): container finished" podID="d10f5924-e560-4a3f-bb67-e7e59ab5fd75" containerID="cb82784025bfa1cf0935dd6150ff02544c3422f10b29939953a0c256cd3cd884" exitCode=0
Feb 24 03:00:01 crc kubenswrapper[4923]: I0224 03:00:01.452950 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv" event={"ID":"d10f5924-e560-4a3f-bb67-e7e59ab5fd75","Type":"ContainerDied","Data":"cb82784025bfa1cf0935dd6150ff02544c3422f10b29939953a0c256cd3cd884"}
Feb 24 03:00:01 crc kubenswrapper[4923]: I0224 03:00:01.453195 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv" event={"ID":"d10f5924-e560-4a3f-bb67-e7e59ab5fd75","Type":"ContainerStarted","Data":"e6d43a63398e328000a8fcf4ae8c2d715d45c833b66d7215ade817d1c59e1b6d"}
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.690223 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.735988 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fnn6\" (UniqueName: \"kubernetes.io/projected/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-kube-api-access-8fnn6\") pod \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") "
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.736076 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-config-volume\") pod \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") "
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.736147 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-secret-volume\") pod \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\" (UID: \"d10f5924-e560-4a3f-bb67-e7e59ab5fd75\") "
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.736922 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-config-volume" (OuterVolumeSpecName: "config-volume") pod "d10f5924-e560-4a3f-bb67-e7e59ab5fd75" (UID: "d10f5924-e560-4a3f-bb67-e7e59ab5fd75"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.740947 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d10f5924-e560-4a3f-bb67-e7e59ab5fd75" (UID: "d10f5924-e560-4a3f-bb67-e7e59ab5fd75"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.741248 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-kube-api-access-8fnn6" (OuterVolumeSpecName: "kube-api-access-8fnn6") pod "d10f5924-e560-4a3f-bb67-e7e59ab5fd75" (UID: "d10f5924-e560-4a3f-bb67-e7e59ab5fd75"). InnerVolumeSpecName "kube-api-access-8fnn6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.837870 4923 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.837904 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fnn6\" (UniqueName: \"kubernetes.io/projected/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-kube-api-access-8fnn6\") on node \"crc\" DevicePath \"\""
Feb 24 03:00:02 crc kubenswrapper[4923]: I0224 03:00:02.837914 4923 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d10f5924-e560-4a3f-bb67-e7e59ab5fd75-config-volume\") on node \"crc\" DevicePath \"\""
Feb 24 03:00:03 crc kubenswrapper[4923]: I0224 03:00:03.466090 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv" event={"ID":"d10f5924-e560-4a3f-bb67-e7e59ab5fd75","Type":"ContainerDied","Data":"e6d43a63398e328000a8fcf4ae8c2d715d45c833b66d7215ade817d1c59e1b6d"}
Feb 24 03:00:03 crc kubenswrapper[4923]: I0224 03:00:03.466134 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6d43a63398e328000a8fcf4ae8c2d715d45c833b66d7215ade817d1c59e1b6d"
Feb 24 03:00:03 crc kubenswrapper[4923]: I0224 03:00:03.466710 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.695270 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ttlp9"]
Feb 24 03:00:05 crc kubenswrapper[4923]: E0224 03:00:05.695855 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10f5924-e560-4a3f-bb67-e7e59ab5fd75" containerName="collect-profiles"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.695874 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10f5924-e560-4a3f-bb67-e7e59ab5fd75" containerName="collect-profiles"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.696006 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10f5924-e560-4a3f-bb67-e7e59ab5fd75" containerName="collect-profiles"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.696540 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.712541 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ttlp9"]
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.775053 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/792dcd38-2295-4b70-a00e-8e26d51f5d30-trusted-ca\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.775112 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/792dcd38-2295-4b70-a00e-8e26d51f5d30-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.775142 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/792dcd38-2295-4b70-a00e-8e26d51f5d30-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.775327 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.775412 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/792dcd38-2295-4b70-a00e-8e26d51f5d30-registry-certificates\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.775445 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/792dcd38-2295-4b70-a00e-8e26d51f5d30-registry-tls\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.775504 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qzc4\" (UniqueName: \"kubernetes.io/projected/792dcd38-2295-4b70-a00e-8e26d51f5d30-kube-api-access-9qzc4\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.775537 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/792dcd38-2295-4b70-a00e-8e26d51f5d30-bound-sa-token\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.793830 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.876526 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/792dcd38-2295-4b70-a00e-8e26d51f5d30-trusted-ca\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.876576 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/792dcd38-2295-4b70-a00e-8e26d51f5d30-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.876626 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/792dcd38-2295-4b70-a00e-8e26d51f5d30-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.876658 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/792dcd38-2295-4b70-a00e-8e26d51f5d30-registry-certificates\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.876678 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/792dcd38-2295-4b70-a00e-8e26d51f5d30-registry-tls\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.876705 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qzc4\" (UniqueName: \"kubernetes.io/projected/792dcd38-2295-4b70-a00e-8e26d51f5d30-kube-api-access-9qzc4\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.876722 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/792dcd38-2295-4b70-a00e-8e26d51f5d30-bound-sa-token\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.877275 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/792dcd38-2295-4b70-a00e-8e26d51f5d30-ca-trust-extracted\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.878151 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/792dcd38-2295-4b70-a00e-8e26d51f5d30-trusted-ca\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.878929 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/792dcd38-2295-4b70-a00e-8e26d51f5d30-registry-certificates\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.881447 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/792dcd38-2295-4b70-a00e-8e26d51f5d30-registry-tls\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.881511 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/792dcd38-2295-4b70-a00e-8e26d51f5d30-installation-pull-secrets\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.893861 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qzc4\" (UniqueName: \"kubernetes.io/projected/792dcd38-2295-4b70-a00e-8e26d51f5d30-kube-api-access-9qzc4\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:05 crc kubenswrapper[4923]: I0224 03:00:05.903820 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/792dcd38-2295-4b70-a00e-8e26d51f5d30-bound-sa-token\") pod \"image-registry-66df7c8f76-ttlp9\" (UID: \"792dcd38-2295-4b70-a00e-8e26d51f5d30\") " pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:06 crc kubenswrapper[4923]: I0224 03:00:06.009058 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:06 crc kubenswrapper[4923]: I0224 03:00:06.210975 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-ttlp9"]
Feb 24 03:00:06 crc kubenswrapper[4923]: I0224 03:00:06.485022 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9" event={"ID":"792dcd38-2295-4b70-a00e-8e26d51f5d30","Type":"ContainerStarted","Data":"3bbb717cc62d393ee7e7d8d56555cf45a37e4328684d943f8246c72e4b2c81c7"}
Feb 24 03:00:06 crc kubenswrapper[4923]: I0224 03:00:06.485091 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9" event={"ID":"792dcd38-2295-4b70-a00e-8e26d51f5d30","Type":"ContainerStarted","Data":"9e5932f966ceaec65f2b967f8ec8d6311a41be889e4002efa9f8bd62ba061022"}
Feb 24 03:00:06 crc kubenswrapper[4923]: I0224 03:00:06.485343 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:06 crc kubenswrapper[4923]: I0224 03:00:06.512817 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9" podStartSLOduration=1.512799034 podStartE2EDuration="1.512799034s" podCreationTimestamp="2026-02-24 03:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:00:06.509348483 +0000 UTC m=+330.526419326" watchObservedRunningTime="2026-02-24 03:00:06.512799034 +0000 UTC m=+330.529869847"
Feb 24 03:00:26 crc kubenswrapper[4923]: I0224 03:00:26.016429 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-ttlp9"
Feb 24 03:00:26 crc kubenswrapper[4923]: I0224 03:00:26.101691 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-95gv5"]
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.220945 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4m2g"]
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.222241 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t4m2g" podUID="906acf28-a57e-4f51-816e-5936cba1548f" containerName="registry-server" containerID="cri-o://24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c" gracePeriod=30
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.261350 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpqqp"]
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.262644 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qpqqp" podUID="844ee205-faee-4873-978e-cf3d64cd8397" containerName="registry-server" containerID="cri-o://62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3" gracePeriod=30
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.271723 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-292s5"]
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.272117 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerName="marketplace-operator" containerID="cri-o://25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e" gracePeriod=30
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.279323 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5lw8"]
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.281405 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k5lw8" podUID="3c62fecb-b531-4754-a747-902b75b2350d" containerName="registry-server" containerID="cri-o://41abb1a86e2ea27a699e3de57c2edc784a3ecc77c3bd6cce545ba4c1ecc6230c" gracePeriod=30
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.294985 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f8kx9"]
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.295951 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9"
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.298057 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f4g69"]
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.298311 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f4g69" podUID="db365e80-350f-4a1f-955c-5d73c4704241" containerName="registry-server" containerID="cri-o://122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df" gracePeriod=30
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.301207 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f8kx9"]
Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.407443 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05612e34-43ff-4719-9bb6-46364400281f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-f8kx9\" (UID: \"05612e34-43ff-4719-9bb6-46364400281f\") "
pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.407784 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7qbp\" (UniqueName: \"kubernetes.io/projected/05612e34-43ff-4719-9bb6-46364400281f-kube-api-access-b7qbp\") pod \"marketplace-operator-79b997595-f8kx9\" (UID: \"05612e34-43ff-4719-9bb6-46364400281f\") " pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.407809 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/05612e34-43ff-4719-9bb6-46364400281f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-f8kx9\" (UID: \"05612e34-43ff-4719-9bb6-46364400281f\") " pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.508694 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/05612e34-43ff-4719-9bb6-46364400281f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-f8kx9\" (UID: \"05612e34-43ff-4719-9bb6-46364400281f\") " pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.508815 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05612e34-43ff-4719-9bb6-46364400281f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-f8kx9\" (UID: \"05612e34-43ff-4719-9bb6-46364400281f\") " pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.508856 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b7qbp\" (UniqueName: \"kubernetes.io/projected/05612e34-43ff-4719-9bb6-46364400281f-kube-api-access-b7qbp\") pod \"marketplace-operator-79b997595-f8kx9\" (UID: \"05612e34-43ff-4719-9bb6-46364400281f\") " pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.510330 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05612e34-43ff-4719-9bb6-46364400281f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-f8kx9\" (UID: \"05612e34-43ff-4719-9bb6-46364400281f\") " pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.516396 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/05612e34-43ff-4719-9bb6-46364400281f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-f8kx9\" (UID: \"05612e34-43ff-4719-9bb6-46364400281f\") " pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.522408 4923 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-292s5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.522457 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.533326 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b7qbp\" (UniqueName: \"kubernetes.io/projected/05612e34-43ff-4719-9bb6-46364400281f-kube-api-access-b7qbp\") pod \"marketplace-operator-79b997595-f8kx9\" (UID: \"05612e34-43ff-4719-9bb6-46364400281f\") " pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.692963 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.695911 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.702409 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qpqqp" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.706456 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.708155 4923 generic.go:334] "Generic (PLEG): container finished" podID="844ee205-faee-4873-978e-cf3d64cd8397" containerID="62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3" exitCode=0 Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.708188 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qpqqp" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.708217 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpqqp" event={"ID":"844ee205-faee-4873-978e-cf3d64cd8397","Type":"ContainerDied","Data":"62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.708337 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qpqqp" event={"ID":"844ee205-faee-4873-978e-cf3d64cd8397","Type":"ContainerDied","Data":"6e38422ca8a6e962169ac1f5bfcd7b1515a5037100a205c8bf8249cfe2cddcb0"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.708356 4923 scope.go:117] "RemoveContainer" containerID="62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.720510 4923 generic.go:334] "Generic (PLEG): container finished" podID="3c62fecb-b531-4754-a747-902b75b2350d" containerID="41abb1a86e2ea27a699e3de57c2edc784a3ecc77c3bd6cce545ba4c1ecc6230c" exitCode=0 Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.722459 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.732985 4923 scope.go:117] "RemoveContainer" containerID="5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.733201 4923 generic.go:334] "Generic (PLEG): container finished" podID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerID="25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e" exitCode=0 Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.733409 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.742112 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5lw8" event={"ID":"3c62fecb-b531-4754-a747-902b75b2350d","Type":"ContainerDied","Data":"41abb1a86e2ea27a699e3de57c2edc784a3ecc77c3bd6cce545ba4c1ecc6230c"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.742155 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5lw8" event={"ID":"3c62fecb-b531-4754-a747-902b75b2350d","Type":"ContainerDied","Data":"56e185d2688874812cad1c013b4bf7af4a5f52606cb6e4eb556e03276fd97891"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.742167 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56e185d2688874812cad1c013b4bf7af4a5f52606cb6e4eb556e03276fd97891" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.742176 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" event={"ID":"b66c0222-b5e0-4d1e-841e-507c8e61e482","Type":"ContainerDied","Data":"25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.742188 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-292s5" event={"ID":"b66c0222-b5e0-4d1e-841e-507c8e61e482","Type":"ContainerDied","Data":"6d99c7cb2f5a28862baa88829363b97fde2cabc13a9ca38c65e6cf7079139f86"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.743364 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.744687 4923 generic.go:334] "Generic (PLEG): container finished" podID="906acf28-a57e-4f51-816e-5936cba1548f" containerID="24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c" exitCode=0 Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.744781 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4m2g" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.744936 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4m2g" event={"ID":"906acf28-a57e-4f51-816e-5936cba1548f","Type":"ContainerDied","Data":"24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.744985 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4m2g" event={"ID":"906acf28-a57e-4f51-816e-5936cba1548f","Type":"ContainerDied","Data":"2a9f9fabfc140c26678952b582be04b38082d65f75328dade37f46aba3f08d8c"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.754258 4923 generic.go:334] "Generic (PLEG): container finished" podID="db365e80-350f-4a1f-955c-5d73c4704241" containerID="122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df" exitCode=0 Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.754331 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4g69" event={"ID":"db365e80-350f-4a1f-955c-5d73c4704241","Type":"ContainerDied","Data":"122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.754366 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f4g69" 
event={"ID":"db365e80-350f-4a1f-955c-5d73c4704241","Type":"ContainerDied","Data":"59538f4838b3abea17788e0b977575d63add14727daa5c16d4fc2c73cd96470d"} Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.754721 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f4g69" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.788809 4923 scope.go:117] "RemoveContainer" containerID="a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.812034 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-utilities\") pod \"844ee205-faee-4873-978e-cf3d64cd8397\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.812089 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-catalog-content\") pod \"906acf28-a57e-4f51-816e-5936cba1548f\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.812118 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-utilities\") pod \"906acf28-a57e-4f51-816e-5936cba1548f\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.812146 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75dbv\" (UniqueName: \"kubernetes.io/projected/844ee205-faee-4873-978e-cf3d64cd8397-kube-api-access-75dbv\") pod \"844ee205-faee-4873-978e-cf3d64cd8397\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.812192 
4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-operator-metrics\") pod \"b66c0222-b5e0-4d1e-841e-507c8e61e482\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.812211 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-trusted-ca\") pod \"b66c0222-b5e0-4d1e-841e-507c8e61e482\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.812240 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfr2k\" (UniqueName: \"kubernetes.io/projected/b66c0222-b5e0-4d1e-841e-507c8e61e482-kube-api-access-pfr2k\") pod \"b66c0222-b5e0-4d1e-841e-507c8e61e482\" (UID: \"b66c0222-b5e0-4d1e-841e-507c8e61e482\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.812279 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thms4\" (UniqueName: \"kubernetes.io/projected/906acf28-a57e-4f51-816e-5936cba1548f-kube-api-access-thms4\") pod \"906acf28-a57e-4f51-816e-5936cba1548f\" (UID: \"906acf28-a57e-4f51-816e-5936cba1548f\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.812305 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-catalog-content\") pod \"844ee205-faee-4873-978e-cf3d64cd8397\" (UID: \"844ee205-faee-4873-978e-cf3d64cd8397\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.813240 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-utilities" 
(OuterVolumeSpecName: "utilities") pod "844ee205-faee-4873-978e-cf3d64cd8397" (UID: "844ee205-faee-4873-978e-cf3d64cd8397"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.813326 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-utilities" (OuterVolumeSpecName: "utilities") pod "906acf28-a57e-4f51-816e-5936cba1548f" (UID: "906acf28-a57e-4f51-816e-5936cba1548f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.817948 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b66c0222-b5e0-4d1e-841e-507c8e61e482" (UID: "b66c0222-b5e0-4d1e-841e-507c8e61e482"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.818628 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906acf28-a57e-4f51-816e-5936cba1548f-kube-api-access-thms4" (OuterVolumeSpecName: "kube-api-access-thms4") pod "906acf28-a57e-4f51-816e-5936cba1548f" (UID: "906acf28-a57e-4f51-816e-5936cba1548f"). InnerVolumeSpecName "kube-api-access-thms4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.820003 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66c0222-b5e0-4d1e-841e-507c8e61e482-kube-api-access-pfr2k" (OuterVolumeSpecName: "kube-api-access-pfr2k") pod "b66c0222-b5e0-4d1e-841e-507c8e61e482" (UID: "b66c0222-b5e0-4d1e-841e-507c8e61e482"). InnerVolumeSpecName "kube-api-access-pfr2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.820086 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b66c0222-b5e0-4d1e-841e-507c8e61e482" (UID: "b66c0222-b5e0-4d1e-841e-507c8e61e482"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.820171 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/844ee205-faee-4873-978e-cf3d64cd8397-kube-api-access-75dbv" (OuterVolumeSpecName: "kube-api-access-75dbv") pod "844ee205-faee-4873-978e-cf3d64cd8397" (UID: "844ee205-faee-4873-978e-cf3d64cd8397"). InnerVolumeSpecName "kube-api-access-75dbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.828468 4923 scope.go:117] "RemoveContainer" containerID="62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.829366 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3\": container with ID starting with 62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3 not found: ID does not exist" containerID="62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.829406 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3"} err="failed to get container status \"62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3\": rpc error: code = 
NotFound desc = could not find container \"62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3\": container with ID starting with 62b9ca0aba10dffeb71d443d89eb2f3f7ad39fb8b8cd8e0911807cf0823094f3 not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.829431 4923 scope.go:117] "RemoveContainer" containerID="5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.829745 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e\": container with ID starting with 5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e not found: ID does not exist" containerID="5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.829784 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e"} err="failed to get container status \"5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e\": rpc error: code = NotFound desc = could not find container \"5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e\": container with ID starting with 5405d746387389ab271470373f19cb3d2345e47bc73ebc10da44cad974910d7e not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.829811 4923 scope.go:117] "RemoveContainer" containerID="a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.830031 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be\": container with ID starting with 
a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be not found: ID does not exist" containerID="a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.830049 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be"} err="failed to get container status \"a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be\": rpc error: code = NotFound desc = could not find container \"a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be\": container with ID starting with a9339cf817cbbde77d284f2abdab640473979b107e8ff53679e1d82b755c26be not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.830063 4923 scope.go:117] "RemoveContainer" containerID="25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.845840 4923 scope.go:117] "RemoveContainer" containerID="06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.866070 4923 scope.go:117] "RemoveContainer" containerID="25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.866645 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e\": container with ID starting with 25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e not found: ID does not exist" containerID="25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.866673 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e"} 
err="failed to get container status \"25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e\": rpc error: code = NotFound desc = could not find container \"25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e\": container with ID starting with 25f301ce6972fe8e7f4c985212f1c7c006925248d1ae9833c8e5ba0c8bcd681e not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.866693 4923 scope.go:117] "RemoveContainer" containerID="06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.866926 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0\": container with ID starting with 06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0 not found: ID does not exist" containerID="06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.866943 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0"} err="failed to get container status \"06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0\": rpc error: code = NotFound desc = could not find container \"06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0\": container with ID starting with 06fe2428a9f0819074ea52e68d38814f9a286bad2d55f72959e99ae3eb07e8d0 not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.866955 4923 scope.go:117] "RemoveContainer" containerID="24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.882941 4923 scope.go:117] "RemoveContainer" containerID="6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72" Feb 24 03:00:45 crc kubenswrapper[4923]: 
I0224 03:00:45.896020 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "844ee205-faee-4873-978e-cf3d64cd8397" (UID: "844ee205-faee-4873-978e-cf3d64cd8397"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.900991 4923 scope.go:117] "RemoveContainer" containerID="ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.902633 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "906acf28-a57e-4f51-816e-5936cba1548f" (UID: "906acf28-a57e-4f51-816e-5936cba1548f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.912994 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-utilities\") pod \"3c62fecb-b531-4754-a747-902b75b2350d\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913023 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-utilities\") pod \"db365e80-350f-4a1f-955c-5d73c4704241\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913043 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-catalog-content\") pod 
\"3c62fecb-b531-4754-a747-902b75b2350d\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913065 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-catalog-content\") pod \"db365e80-350f-4a1f-955c-5d73c4704241\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913084 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzrh8\" (UniqueName: \"kubernetes.io/projected/db365e80-350f-4a1f-955c-5d73c4704241-kube-api-access-lzrh8\") pod \"db365e80-350f-4a1f-955c-5d73c4704241\" (UID: \"db365e80-350f-4a1f-955c-5d73c4704241\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913182 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf9mj\" (UniqueName: \"kubernetes.io/projected/3c62fecb-b531-4754-a747-902b75b2350d-kube-api-access-gf9mj\") pod \"3c62fecb-b531-4754-a747-902b75b2350d\" (UID: \"3c62fecb-b531-4754-a747-902b75b2350d\") " Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913406 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913422 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913432 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/906acf28-a57e-4f51-816e-5936cba1548f-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:45 crc 
kubenswrapper[4923]: I0224 03:00:45.913441 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75dbv\" (UniqueName: \"kubernetes.io/projected/844ee205-faee-4873-978e-cf3d64cd8397-kube-api-access-75dbv\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913452 4923 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913464 4923 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b66c0222-b5e0-4d1e-841e-507c8e61e482-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913476 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfr2k\" (UniqueName: \"kubernetes.io/projected/b66c0222-b5e0-4d1e-841e-507c8e61e482-kube-api-access-pfr2k\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913485 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thms4\" (UniqueName: \"kubernetes.io/projected/906acf28-a57e-4f51-816e-5936cba1548f-kube-api-access-thms4\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913495 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/844ee205-faee-4873-978e-cf3d64cd8397-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.913902 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-utilities" (OuterVolumeSpecName: "utilities") pod "3c62fecb-b531-4754-a747-902b75b2350d" (UID: 
"3c62fecb-b531-4754-a747-902b75b2350d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.915796 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-utilities" (OuterVolumeSpecName: "utilities") pod "db365e80-350f-4a1f-955c-5d73c4704241" (UID: "db365e80-350f-4a1f-955c-5d73c4704241"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.929633 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c62fecb-b531-4754-a747-902b75b2350d-kube-api-access-gf9mj" (OuterVolumeSpecName: "kube-api-access-gf9mj") pod "3c62fecb-b531-4754-a747-902b75b2350d" (UID: "3c62fecb-b531-4754-a747-902b75b2350d"). InnerVolumeSpecName "kube-api-access-gf9mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.929655 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db365e80-350f-4a1f-955c-5d73c4704241-kube-api-access-lzrh8" (OuterVolumeSpecName: "kube-api-access-lzrh8") pod "db365e80-350f-4a1f-955c-5d73c4704241" (UID: "db365e80-350f-4a1f-955c-5d73c4704241"). InnerVolumeSpecName "kube-api-access-lzrh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.929809 4923 scope.go:117] "RemoveContainer" containerID="24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.930280 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c\": container with ID starting with 24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c not found: ID does not exist" containerID="24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.930322 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c"} err="failed to get container status \"24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c\": rpc error: code = NotFound desc = could not find container \"24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c\": container with ID starting with 24cf1d644eee8cce38b00bac8493aa1c47285f6800d5fe92a39a3698ab45af8c not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.930343 4923 scope.go:117] "RemoveContainer" containerID="6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.930581 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72\": container with ID starting with 6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72 not found: ID does not exist" containerID="6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.930596 
4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72"} err="failed to get container status \"6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72\": rpc error: code = NotFound desc = could not find container \"6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72\": container with ID starting with 6b9c908a4e255ca83d91e64b8b20484f264be8075f44544eeef8a3488e63ce72 not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.930608 4923 scope.go:117] "RemoveContainer" containerID="ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.930811 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c\": container with ID starting with ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c not found: ID does not exist" containerID="ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.930826 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c"} err="failed to get container status \"ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c\": rpc error: code = NotFound desc = could not find container \"ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c\": container with ID starting with ddc663eee94e5aff1d89fddabba337ccc8aa328ffa9f5ecae727211726ba178c not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.930839 4923 scope.go:117] "RemoveContainer" containerID="122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 
03:00:45.943031 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c62fecb-b531-4754-a747-902b75b2350d" (UID: "3c62fecb-b531-4754-a747-902b75b2350d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.948416 4923 scope.go:117] "RemoveContainer" containerID="68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.970515 4923 scope.go:117] "RemoveContainer" containerID="4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.975208 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f8kx9"] Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.987610 4923 scope.go:117] "RemoveContainer" containerID="122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.987998 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df\": container with ID starting with 122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df not found: ID does not exist" containerID="122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.988029 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df"} err="failed to get container status \"122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df\": rpc error: code = NotFound desc = could not find container 
\"122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df\": container with ID starting with 122c6c6aba7c038b66adad62c1db37a7bb5b010cd818d138a606a5282dda92df not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.988065 4923 scope.go:117] "RemoveContainer" containerID="68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.988351 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995\": container with ID starting with 68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995 not found: ID does not exist" containerID="68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.988368 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995"} err="failed to get container status \"68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995\": rpc error: code = NotFound desc = could not find container \"68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995\": container with ID starting with 68b4fb56c83c0e013c92f2b4081051f55caad5954d96abeef76952b6b1026995 not found: ID does not exist" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.988402 4923 scope.go:117] "RemoveContainer" containerID="4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58" Feb 24 03:00:45 crc kubenswrapper[4923]: E0224 03:00:45.988769 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58\": container with ID starting with 4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58 not found: ID does not exist" 
containerID="4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58" Feb 24 03:00:45 crc kubenswrapper[4923]: I0224 03:00:45.988824 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58"} err="failed to get container status \"4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58\": rpc error: code = NotFound desc = could not find container \"4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58\": container with ID starting with 4c00fd99a0fbd4bb2e0ea65e141e8ba1427f09b506ed0b13f29f8f70888cfb58 not found: ID does not exist" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.015793 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.015832 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.015847 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c62fecb-b531-4754-a747-902b75b2350d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.015863 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzrh8\" (UniqueName: \"kubernetes.io/projected/db365e80-350f-4a1f-955c-5d73c4704241-kube-api-access-lzrh8\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.015875 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf9mj\" (UniqueName: \"kubernetes.io/projected/3c62fecb-b531-4754-a747-902b75b2350d-kube-api-access-gf9mj\") on node 
\"crc\" DevicePath \"\"" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.047085 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db365e80-350f-4a1f-955c-5d73c4704241" (UID: "db365e80-350f-4a1f-955c-5d73c4704241"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.068108 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qpqqp"] Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.071941 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qpqqp"] Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.081951 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-292s5"] Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.085214 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-292s5"] Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.108648 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4m2g"] Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.116931 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t4m2g"] Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.119358 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db365e80-350f-4a1f-955c-5d73c4704241-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.128586 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f4g69"] Feb 24 03:00:46 crc 
kubenswrapper[4923]: I0224 03:00:46.130434 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f4g69"] Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.763129 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" event={"ID":"05612e34-43ff-4719-9bb6-46364400281f","Type":"ContainerStarted","Data":"d94b3d4c663c9be709d3d10cf2dfd0055ed190d819a69f84266623039cdd34e1"} Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.763174 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" event={"ID":"05612e34-43ff-4719-9bb6-46364400281f","Type":"ContainerStarted","Data":"63d3cc9634c60949542ec390e757e1325d906915f7cb17054471ba8c84347914"} Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.764514 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.766869 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.768220 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5lw8" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.795862 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-f8kx9" podStartSLOduration=1.795843174 podStartE2EDuration="1.795843174s" podCreationTimestamp="2026-02-24 03:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:00:46.781124393 +0000 UTC m=+370.798195266" watchObservedRunningTime="2026-02-24 03:00:46.795843174 +0000 UTC m=+370.812913987" Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.874476 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5lw8"] Feb 24 03:00:46 crc kubenswrapper[4923]: I0224 03:00:46.877768 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5lw8"] Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.433889 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s2s2b"] Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434073 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906acf28-a57e-4f51-816e-5936cba1548f" containerName="extract-utilities" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434084 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="906acf28-a57e-4f51-816e-5936cba1548f" containerName="extract-utilities" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434096 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db365e80-350f-4a1f-955c-5d73c4704241" containerName="extract-content" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434103 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="db365e80-350f-4a1f-955c-5d73c4704241" containerName="extract-content" Feb 24 03:00:47 crc 
kubenswrapper[4923]: E0224 03:00:47.434111 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844ee205-faee-4873-978e-cf3d64cd8397" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434118 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="844ee205-faee-4873-978e-cf3d64cd8397" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434127 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db365e80-350f-4a1f-955c-5d73c4704241" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434133 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="db365e80-350f-4a1f-955c-5d73c4704241" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434143 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerName="marketplace-operator" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434148 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerName="marketplace-operator" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434155 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c62fecb-b531-4754-a747-902b75b2350d" containerName="extract-content" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434160 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c62fecb-b531-4754-a747-902b75b2350d" containerName="extract-content" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434167 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844ee205-faee-4873-978e-cf3d64cd8397" containerName="extract-utilities" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434172 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="844ee205-faee-4873-978e-cf3d64cd8397" containerName="extract-utilities" Feb 24 03:00:47 crc 
kubenswrapper[4923]: E0224 03:00:47.434181 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db365e80-350f-4a1f-955c-5d73c4704241" containerName="extract-utilities" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434187 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="db365e80-350f-4a1f-955c-5d73c4704241" containerName="extract-utilities" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434195 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844ee205-faee-4873-978e-cf3d64cd8397" containerName="extract-content" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434200 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="844ee205-faee-4873-978e-cf3d64cd8397" containerName="extract-content" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434208 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906acf28-a57e-4f51-816e-5936cba1548f" containerName="extract-content" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434213 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="906acf28-a57e-4f51-816e-5936cba1548f" containerName="extract-content" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434220 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c62fecb-b531-4754-a747-902b75b2350d" containerName="extract-utilities" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434226 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c62fecb-b531-4754-a747-902b75b2350d" containerName="extract-utilities" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434234 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906acf28-a57e-4f51-816e-5936cba1548f" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434240 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="906acf28-a57e-4f51-816e-5936cba1548f" containerName="registry-server" Feb 24 03:00:47 crc 
kubenswrapper[4923]: E0224 03:00:47.434247 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c62fecb-b531-4754-a747-902b75b2350d" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434253 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c62fecb-b531-4754-a747-902b75b2350d" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434355 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="906acf28-a57e-4f51-816e-5936cba1548f" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434364 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="844ee205-faee-4873-978e-cf3d64cd8397" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434373 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="db365e80-350f-4a1f-955c-5d73c4704241" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434382 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerName="marketplace-operator" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434388 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerName="marketplace-operator" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434396 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c62fecb-b531-4754-a747-902b75b2350d" containerName="registry-server" Feb 24 03:00:47 crc kubenswrapper[4923]: E0224 03:00:47.434483 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" containerName="marketplace-operator" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.434490 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" 
containerName="marketplace-operator" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.435061 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.439663 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.443488 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2s2b"] Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.465238 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ade27f6-4909-4b58-b29d-d7b74686d166-utilities\") pod \"redhat-marketplace-s2s2b\" (UID: \"3ade27f6-4909-4b58-b29d-d7b74686d166\") " pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.465290 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ade27f6-4909-4b58-b29d-d7b74686d166-catalog-content\") pod \"redhat-marketplace-s2s2b\" (UID: \"3ade27f6-4909-4b58-b29d-d7b74686d166\") " pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.465357 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb5fg\" (UniqueName: \"kubernetes.io/projected/3ade27f6-4909-4b58-b29d-d7b74686d166-kube-api-access-mb5fg\") pod \"redhat-marketplace-s2s2b\" (UID: \"3ade27f6-4909-4b58-b29d-d7b74686d166\") " pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.566499 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/3ade27f6-4909-4b58-b29d-d7b74686d166-utilities\") pod \"redhat-marketplace-s2s2b\" (UID: \"3ade27f6-4909-4b58-b29d-d7b74686d166\") " pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.566551 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ade27f6-4909-4b58-b29d-d7b74686d166-catalog-content\") pod \"redhat-marketplace-s2s2b\" (UID: \"3ade27f6-4909-4b58-b29d-d7b74686d166\") " pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.566582 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb5fg\" (UniqueName: \"kubernetes.io/projected/3ade27f6-4909-4b58-b29d-d7b74686d166-kube-api-access-mb5fg\") pod \"redhat-marketplace-s2s2b\" (UID: \"3ade27f6-4909-4b58-b29d-d7b74686d166\") " pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.566953 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ade27f6-4909-4b58-b29d-d7b74686d166-utilities\") pod \"redhat-marketplace-s2s2b\" (UID: \"3ade27f6-4909-4b58-b29d-d7b74686d166\") " pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.567035 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ade27f6-4909-4b58-b29d-d7b74686d166-catalog-content\") pod \"redhat-marketplace-s2s2b\" (UID: \"3ade27f6-4909-4b58-b29d-d7b74686d166\") " pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.585354 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb5fg\" (UniqueName: 
\"kubernetes.io/projected/3ade27f6-4909-4b58-b29d-d7b74686d166-kube-api-access-mb5fg\") pod \"redhat-marketplace-s2s2b\" (UID: \"3ade27f6-4909-4b58-b29d-d7b74686d166\") " pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.637362 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6c8kn"] Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.639101 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.640603 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.648272 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6c8kn"] Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.668574 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nttrt\" (UniqueName: \"kubernetes.io/projected/3bfb8ad5-974b-4507-96cf-1150c1ca8937-kube-api-access-nttrt\") pod \"redhat-operators-6c8kn\" (UID: \"3bfb8ad5-974b-4507-96cf-1150c1ca8937\") " pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.668636 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bfb8ad5-974b-4507-96cf-1150c1ca8937-catalog-content\") pod \"redhat-operators-6c8kn\" (UID: \"3bfb8ad5-974b-4507-96cf-1150c1ca8937\") " pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.668689 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3bfb8ad5-974b-4507-96cf-1150c1ca8937-utilities\") pod \"redhat-operators-6c8kn\" (UID: \"3bfb8ad5-974b-4507-96cf-1150c1ca8937\") " pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.719348 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c62fecb-b531-4754-a747-902b75b2350d" path="/var/lib/kubelet/pods/3c62fecb-b531-4754-a747-902b75b2350d/volumes" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.719938 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="844ee205-faee-4873-978e-cf3d64cd8397" path="/var/lib/kubelet/pods/844ee205-faee-4873-978e-cf3d64cd8397/volumes" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.720537 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906acf28-a57e-4f51-816e-5936cba1548f" path="/var/lib/kubelet/pods/906acf28-a57e-4f51-816e-5936cba1548f/volumes" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.721491 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66c0222-b5e0-4d1e-841e-507c8e61e482" path="/var/lib/kubelet/pods/b66c0222-b5e0-4d1e-841e-507c8e61e482/volumes" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.721933 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db365e80-350f-4a1f-955c-5d73c4704241" path="/var/lib/kubelet/pods/db365e80-350f-4a1f-955c-5d73c4704241/volumes" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.751055 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.769569 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bfb8ad5-974b-4507-96cf-1150c1ca8937-utilities\") pod \"redhat-operators-6c8kn\" (UID: \"3bfb8ad5-974b-4507-96cf-1150c1ca8937\") " pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.769672 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nttrt\" (UniqueName: \"kubernetes.io/projected/3bfb8ad5-974b-4507-96cf-1150c1ca8937-kube-api-access-nttrt\") pod \"redhat-operators-6c8kn\" (UID: \"3bfb8ad5-974b-4507-96cf-1150c1ca8937\") " pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.769704 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bfb8ad5-974b-4507-96cf-1150c1ca8937-catalog-content\") pod \"redhat-operators-6c8kn\" (UID: \"3bfb8ad5-974b-4507-96cf-1150c1ca8937\") " pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.770899 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bfb8ad5-974b-4507-96cf-1150c1ca8937-utilities\") pod \"redhat-operators-6c8kn\" (UID: \"3bfb8ad5-974b-4507-96cf-1150c1ca8937\") " pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.771380 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bfb8ad5-974b-4507-96cf-1150c1ca8937-catalog-content\") pod \"redhat-operators-6c8kn\" (UID: \"3bfb8ad5-974b-4507-96cf-1150c1ca8937\") " 
pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.788029 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nttrt\" (UniqueName: \"kubernetes.io/projected/3bfb8ad5-974b-4507-96cf-1150c1ca8937-kube-api-access-nttrt\") pod \"redhat-operators-6c8kn\" (UID: \"3bfb8ad5-974b-4507-96cf-1150c1ca8937\") " pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.929156 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2s2b"] Feb 24 03:00:47 crc kubenswrapper[4923]: W0224 03:00:47.939443 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ade27f6_4909_4b58_b29d_d7b74686d166.slice/crio-035f3b24963de3069662946af5198a80e490bbf2284966f48a56433dac8fc492 WatchSource:0}: Error finding container 035f3b24963de3069662946af5198a80e490bbf2284966f48a56433dac8fc492: Status 404 returned error can't find the container with id 035f3b24963de3069662946af5198a80e490bbf2284966f48a56433dac8fc492 Feb 24 03:00:47 crc kubenswrapper[4923]: I0224 03:00:47.961714 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:48 crc kubenswrapper[4923]: I0224 03:00:48.136784 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6c8kn"] Feb 24 03:00:48 crc kubenswrapper[4923]: W0224 03:00:48.146850 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bfb8ad5_974b_4507_96cf_1150c1ca8937.slice/crio-52871955e96c5e25bf224be5e04db931a85ef896427c1d8cb742fe56ebcde421 WatchSource:0}: Error finding container 52871955e96c5e25bf224be5e04db931a85ef896427c1d8cb742fe56ebcde421: Status 404 returned error can't find the container with id 52871955e96c5e25bf224be5e04db931a85ef896427c1d8cb742fe56ebcde421 Feb 24 03:00:48 crc kubenswrapper[4923]: I0224 03:00:48.786222 4923 generic.go:334] "Generic (PLEG): container finished" podID="3ade27f6-4909-4b58-b29d-d7b74686d166" containerID="e5621b179154fc873c9d2e18ef8086ccdc6c98de1ff22b5ad8550fbf1637d127" exitCode=0 Feb 24 03:00:48 crc kubenswrapper[4923]: I0224 03:00:48.786360 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2s2b" event={"ID":"3ade27f6-4909-4b58-b29d-d7b74686d166","Type":"ContainerDied","Data":"e5621b179154fc873c9d2e18ef8086ccdc6c98de1ff22b5ad8550fbf1637d127"} Feb 24 03:00:48 crc kubenswrapper[4923]: I0224 03:00:48.786422 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2s2b" event={"ID":"3ade27f6-4909-4b58-b29d-d7b74686d166","Type":"ContainerStarted","Data":"035f3b24963de3069662946af5198a80e490bbf2284966f48a56433dac8fc492"} Feb 24 03:00:48 crc kubenswrapper[4923]: I0224 03:00:48.788616 4923 generic.go:334] "Generic (PLEG): container finished" podID="3bfb8ad5-974b-4507-96cf-1150c1ca8937" containerID="3d224d14c2c4df3cab9b3443c5fe8d9d30fe11c7bd7ee7af829355d752424daf" exitCode=0 Feb 24 03:00:48 crc kubenswrapper[4923]: I0224 
03:00:48.788669 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8kn" event={"ID":"3bfb8ad5-974b-4507-96cf-1150c1ca8937","Type":"ContainerDied","Data":"3d224d14c2c4df3cab9b3443c5fe8d9d30fe11c7bd7ee7af829355d752424daf"} Feb 24 03:00:48 crc kubenswrapper[4923]: I0224 03:00:48.788720 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8kn" event={"ID":"3bfb8ad5-974b-4507-96cf-1150c1ca8937","Type":"ContainerStarted","Data":"52871955e96c5e25bf224be5e04db931a85ef896427c1d8cb742fe56ebcde421"} Feb 24 03:00:49 crc kubenswrapper[4923]: I0224 03:00:49.798097 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2s2b" event={"ID":"3ade27f6-4909-4b58-b29d-d7b74686d166","Type":"ContainerStarted","Data":"2a47a755c8b9504b5f11c8481e94d35ac8ae4c5da33d072111a4ef2dadd9a7b3"} Feb 24 03:00:49 crc kubenswrapper[4923]: I0224 03:00:49.837451 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bgg4b"] Feb 24 03:00:49 crc kubenswrapper[4923]: I0224 03:00:49.838718 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:49 crc kubenswrapper[4923]: I0224 03:00:49.841582 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 24 03:00:49 crc kubenswrapper[4923]: I0224 03:00:49.847248 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bgg4b"] Feb 24 03:00:49 crc kubenswrapper[4923]: I0224 03:00:49.998151 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rvqn\" (UniqueName: \"kubernetes.io/projected/1ddcf8ff-1207-46ff-9dde-08219d670309-kube-api-access-6rvqn\") pod \"community-operators-bgg4b\" (UID: \"1ddcf8ff-1207-46ff-9dde-08219d670309\") " pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:49 crc kubenswrapper[4923]: I0224 03:00:49.998213 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf8ff-1207-46ff-9dde-08219d670309-catalog-content\") pod \"community-operators-bgg4b\" (UID: \"1ddcf8ff-1207-46ff-9dde-08219d670309\") " pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:49 crc kubenswrapper[4923]: I0224 03:00:49.998254 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf8ff-1207-46ff-9dde-08219d670309-utilities\") pod \"community-operators-bgg4b\" (UID: \"1ddcf8ff-1207-46ff-9dde-08219d670309\") " pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.041960 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t8gng"] Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.043233 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.044946 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.047053 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8gng"] Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.099942 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvqn\" (UniqueName: \"kubernetes.io/projected/1ddcf8ff-1207-46ff-9dde-08219d670309-kube-api-access-6rvqn\") pod \"community-operators-bgg4b\" (UID: \"1ddcf8ff-1207-46ff-9dde-08219d670309\") " pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.099998 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf8ff-1207-46ff-9dde-08219d670309-catalog-content\") pod \"community-operators-bgg4b\" (UID: \"1ddcf8ff-1207-46ff-9dde-08219d670309\") " pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.100034 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf8ff-1207-46ff-9dde-08219d670309-utilities\") pod \"community-operators-bgg4b\" (UID: \"1ddcf8ff-1207-46ff-9dde-08219d670309\") " pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.100449 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf8ff-1207-46ff-9dde-08219d670309-utilities\") pod \"community-operators-bgg4b\" (UID: \"1ddcf8ff-1207-46ff-9dde-08219d670309\") " 
pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.100764 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf8ff-1207-46ff-9dde-08219d670309-catalog-content\") pod \"community-operators-bgg4b\" (UID: \"1ddcf8ff-1207-46ff-9dde-08219d670309\") " pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.120577 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvqn\" (UniqueName: \"kubernetes.io/projected/1ddcf8ff-1207-46ff-9dde-08219d670309-kube-api-access-6rvqn\") pod \"community-operators-bgg4b\" (UID: \"1ddcf8ff-1207-46ff-9dde-08219d670309\") " pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.157607 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.201131 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d03834e-9594-476a-a7a9-1bf1aa9ade01-utilities\") pod \"certified-operators-t8gng\" (UID: \"5d03834e-9594-476a-a7a9-1bf1aa9ade01\") " pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.201390 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d03834e-9594-476a-a7a9-1bf1aa9ade01-catalog-content\") pod \"certified-operators-t8gng\" (UID: \"5d03834e-9594-476a-a7a9-1bf1aa9ade01\") " pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.201560 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6nnc\" (UniqueName: \"kubernetes.io/projected/5d03834e-9594-476a-a7a9-1bf1aa9ade01-kube-api-access-w6nnc\") pod \"certified-operators-t8gng\" (UID: \"5d03834e-9594-476a-a7a9-1bf1aa9ade01\") " pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.303357 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d03834e-9594-476a-a7a9-1bf1aa9ade01-utilities\") pod \"certified-operators-t8gng\" (UID: \"5d03834e-9594-476a-a7a9-1bf1aa9ade01\") " pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.303725 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d03834e-9594-476a-a7a9-1bf1aa9ade01-catalog-content\") pod \"certified-operators-t8gng\" (UID: \"5d03834e-9594-476a-a7a9-1bf1aa9ade01\") " pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.303760 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6nnc\" (UniqueName: \"kubernetes.io/projected/5d03834e-9594-476a-a7a9-1bf1aa9ade01-kube-api-access-w6nnc\") pod \"certified-operators-t8gng\" (UID: \"5d03834e-9594-476a-a7a9-1bf1aa9ade01\") " pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.304812 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d03834e-9594-476a-a7a9-1bf1aa9ade01-utilities\") pod \"certified-operators-t8gng\" (UID: \"5d03834e-9594-476a-a7a9-1bf1aa9ade01\") " pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.304905 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d03834e-9594-476a-a7a9-1bf1aa9ade01-catalog-content\") pod \"certified-operators-t8gng\" (UID: \"5d03834e-9594-476a-a7a9-1bf1aa9ade01\") " pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.325662 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6nnc\" (UniqueName: \"kubernetes.io/projected/5d03834e-9594-476a-a7a9-1bf1aa9ade01-kube-api-access-w6nnc\") pod \"certified-operators-t8gng\" (UID: \"5d03834e-9594-476a-a7a9-1bf1aa9ade01\") " pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.369122 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.534439 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8gng"] Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.556971 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bgg4b"] Feb 24 03:00:50 crc kubenswrapper[4923]: W0224 03:00:50.572655 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddcf8ff_1207_46ff_9dde_08219d670309.slice/crio-6e9a7121224f981da5129adeabbd2cc7c64065138a5cfeea7bab88f275f2d040 WatchSource:0}: Error finding container 6e9a7121224f981da5129adeabbd2cc7c64065138a5cfeea7bab88f275f2d040: Status 404 returned error can't find the container with id 6e9a7121224f981da5129adeabbd2cc7c64065138a5cfeea7bab88f275f2d040 Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.804210 4923 generic.go:334] "Generic (PLEG): container finished" podID="1ddcf8ff-1207-46ff-9dde-08219d670309" 
containerID="b63781dbc8ca83f9fe61d900dc8792956ffd25d7712e8bb91d192216e80ff773" exitCode=0 Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.804272 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgg4b" event={"ID":"1ddcf8ff-1207-46ff-9dde-08219d670309","Type":"ContainerDied","Data":"b63781dbc8ca83f9fe61d900dc8792956ffd25d7712e8bb91d192216e80ff773"} Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.804566 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgg4b" event={"ID":"1ddcf8ff-1207-46ff-9dde-08219d670309","Type":"ContainerStarted","Data":"6e9a7121224f981da5129adeabbd2cc7c64065138a5cfeea7bab88f275f2d040"} Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.806967 4923 generic.go:334] "Generic (PLEG): container finished" podID="3bfb8ad5-974b-4507-96cf-1150c1ca8937" containerID="ad55a6e47fb8dce5df6bad2b2eb09ca6bfea9e89d5d80281e4ccc247c498c53b" exitCode=0 Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.806990 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8kn" event={"ID":"3bfb8ad5-974b-4507-96cf-1150c1ca8937","Type":"ContainerDied","Data":"ad55a6e47fb8dce5df6bad2b2eb09ca6bfea9e89d5d80281e4ccc247c498c53b"} Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.821066 4923 generic.go:334] "Generic (PLEG): container finished" podID="3ade27f6-4909-4b58-b29d-d7b74686d166" containerID="2a47a755c8b9504b5f11c8481e94d35ac8ae4c5da33d072111a4ef2dadd9a7b3" exitCode=0 Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.821656 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2s2b" event={"ID":"3ade27f6-4909-4b58-b29d-d7b74686d166","Type":"ContainerDied","Data":"2a47a755c8b9504b5f11c8481e94d35ac8ae4c5da33d072111a4ef2dadd9a7b3"} Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.830899 4923 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-t8gng" event={"ID":"5d03834e-9594-476a-a7a9-1bf1aa9ade01","Type":"ContainerStarted","Data":"ad1de1de957b85c14116f8abd4d4bdcd67fd968a01a17e0fbd77560b6c13c0f9"} Feb 24 03:00:50 crc kubenswrapper[4923]: I0224 03:00:50.830939 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8gng" event={"ID":"5d03834e-9594-476a-a7a9-1bf1aa9ade01","Type":"ContainerStarted","Data":"4be42949b73c9e1d713e934022424b94c73a4519c66ba6df8b19fc0ea504b8ab"} Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.157329 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" podUID="7227def5-b373-488f-9f56-4b6ed170751d" containerName="registry" containerID="cri-o://e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835" gracePeriod=30 Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.611657 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.733844 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7227def5-b373-488f-9f56-4b6ed170751d-ca-trust-extracted\") pod \"7227def5-b373-488f-9f56-4b6ed170751d\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.733901 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-bound-sa-token\") pod \"7227def5-b373-488f-9f56-4b6ed170751d\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.733944 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-trusted-ca\") pod \"7227def5-b373-488f-9f56-4b6ed170751d\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.733976 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7227def5-b373-488f-9f56-4b6ed170751d-installation-pull-secrets\") pod \"7227def5-b373-488f-9f56-4b6ed170751d\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.734224 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7227def5-b373-488f-9f56-4b6ed170751d\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.734273 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2c9ck\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-kube-api-access-2c9ck\") pod \"7227def5-b373-488f-9f56-4b6ed170751d\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.734355 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-registry-certificates\") pod \"7227def5-b373-488f-9f56-4b6ed170751d\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.734385 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-registry-tls\") pod \"7227def5-b373-488f-9f56-4b6ed170751d\" (UID: \"7227def5-b373-488f-9f56-4b6ed170751d\") " Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.735066 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7227def5-b373-488f-9f56-4b6ed170751d" (UID: "7227def5-b373-488f-9f56-4b6ed170751d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.735714 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7227def5-b373-488f-9f56-4b6ed170751d" (UID: "7227def5-b373-488f-9f56-4b6ed170751d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.742794 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7227def5-b373-488f-9f56-4b6ed170751d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7227def5-b373-488f-9f56-4b6ed170751d" (UID: "7227def5-b373-488f-9f56-4b6ed170751d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.742818 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7227def5-b373-488f-9f56-4b6ed170751d" (UID: "7227def5-b373-488f-9f56-4b6ed170751d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.743555 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7227def5-b373-488f-9f56-4b6ed170751d" (UID: "7227def5-b373-488f-9f56-4b6ed170751d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.744155 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-kube-api-access-2c9ck" (OuterVolumeSpecName: "kube-api-access-2c9ck") pod "7227def5-b373-488f-9f56-4b6ed170751d" (UID: "7227def5-b373-488f-9f56-4b6ed170751d"). InnerVolumeSpecName "kube-api-access-2c9ck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.765592 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7227def5-b373-488f-9f56-4b6ed170751d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7227def5-b373-488f-9f56-4b6ed170751d" (UID: "7227def5-b373-488f-9f56-4b6ed170751d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.769690 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7227def5-b373-488f-9f56-4b6ed170751d" (UID: "7227def5-b373-488f-9f56-4b6ed170751d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.835311 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.835338 4923 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7227def5-b373-488f-9f56-4b6ed170751d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.835349 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c9ck\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-kube-api-access-2c9ck\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.835359 4923 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/7227def5-b373-488f-9f56-4b6ed170751d-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.835367 4923 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.835377 4923 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7227def5-b373-488f-9f56-4b6ed170751d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.835386 4923 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7227def5-b373-488f-9f56-4b6ed170751d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.837525 4923 generic.go:334] "Generic (PLEG): container finished" podID="7227def5-b373-488f-9f56-4b6ed170751d" containerID="e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835" exitCode=0 Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.837597 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.837603 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" event={"ID":"7227def5-b373-488f-9f56-4b6ed170751d","Type":"ContainerDied","Data":"e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835"} Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.837636 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-95gv5" event={"ID":"7227def5-b373-488f-9f56-4b6ed170751d","Type":"ContainerDied","Data":"f6ea8b0bba77d4eb42b7e9e465d87d18d8bc41669adafd50d2229f607ea7f8df"} Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.837656 4923 scope.go:117] "RemoveContainer" containerID="e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.840163 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2s2b" event={"ID":"3ade27f6-4909-4b58-b29d-d7b74686d166","Type":"ContainerStarted","Data":"b65e36723e057b0dea4a9de6aa0ce5fa9bb44c58c5ae409a1a2a4bb995659dc9"} Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.843412 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6c8kn" event={"ID":"3bfb8ad5-974b-4507-96cf-1150c1ca8937","Type":"ContainerStarted","Data":"d669d2d750342f664d2e023504ebbf2f8e1e11cb455a35e7fe7d8db2d2557b37"} Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.848434 4923 generic.go:334] "Generic (PLEG): container finished" podID="5d03834e-9594-476a-a7a9-1bf1aa9ade01" containerID="ad1de1de957b85c14116f8abd4d4bdcd67fd968a01a17e0fbd77560b6c13c0f9" exitCode=0 Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.848504 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8gng" 
event={"ID":"5d03834e-9594-476a-a7a9-1bf1aa9ade01","Type":"ContainerDied","Data":"ad1de1de957b85c14116f8abd4d4bdcd67fd968a01a17e0fbd77560b6c13c0f9"} Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.857751 4923 scope.go:117] "RemoveContainer" containerID="e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835" Feb 24 03:00:51 crc kubenswrapper[4923]: E0224 03:00:51.858406 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835\": container with ID starting with e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835 not found: ID does not exist" containerID="e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.858454 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835"} err="failed to get container status \"e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835\": rpc error: code = NotFound desc = could not find container \"e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835\": container with ID starting with e4d4b5b7e93287205991690373a4db011b8c5168cbd8fc0847be41f5b249e835 not found: ID does not exist" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.859427 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s2s2b" podStartSLOduration=2.135894393 podStartE2EDuration="4.85940701s" podCreationTimestamp="2026-02-24 03:00:47 +0000 UTC" firstStartedPulling="2026-02-24 03:00:48.787807753 +0000 UTC m=+372.804878576" lastFinishedPulling="2026-02-24 03:00:51.51132038 +0000 UTC m=+375.528391193" observedRunningTime="2026-02-24 03:00:51.855101365 +0000 UTC m=+375.872172178" watchObservedRunningTime="2026-02-24 03:00:51.85940701 +0000 UTC 
m=+375.876477823" Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.871634 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-95gv5"] Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.878028 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-95gv5"] Feb 24 03:00:51 crc kubenswrapper[4923]: I0224 03:00:51.895745 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6c8kn" podStartSLOduration=2.374111989 podStartE2EDuration="4.895726816s" podCreationTimestamp="2026-02-24 03:00:47 +0000 UTC" firstStartedPulling="2026-02-24 03:00:48.791129791 +0000 UTC m=+372.808200604" lastFinishedPulling="2026-02-24 03:00:51.312744618 +0000 UTC m=+375.329815431" observedRunningTime="2026-02-24 03:00:51.892544601 +0000 UTC m=+375.909615424" watchObservedRunningTime="2026-02-24 03:00:51.895726816 +0000 UTC m=+375.912797629" Feb 24 03:00:52 crc kubenswrapper[4923]: I0224 03:00:52.855022 4923 generic.go:334] "Generic (PLEG): container finished" podID="1ddcf8ff-1207-46ff-9dde-08219d670309" containerID="9f14f98fc299eb3729aa78878bf10e07dcf95ce94756c969c1f0b26c4d1cccad" exitCode=0 Feb 24 03:00:52 crc kubenswrapper[4923]: I0224 03:00:52.855120 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgg4b" event={"ID":"1ddcf8ff-1207-46ff-9dde-08219d670309","Type":"ContainerDied","Data":"9f14f98fc299eb3729aa78878bf10e07dcf95ce94756c969c1f0b26c4d1cccad"} Feb 24 03:00:52 crc kubenswrapper[4923]: I0224 03:00:52.858658 4923 generic.go:334] "Generic (PLEG): container finished" podID="5d03834e-9594-476a-a7a9-1bf1aa9ade01" containerID="dadce8e7207f74916350f5fd6b1207ed4b73d89ca57d23b183ddb5fe072945a4" exitCode=0 Feb 24 03:00:52 crc kubenswrapper[4923]: I0224 03:00:52.858724 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-t8gng" event={"ID":"5d03834e-9594-476a-a7a9-1bf1aa9ade01","Type":"ContainerDied","Data":"dadce8e7207f74916350f5fd6b1207ed4b73d89ca57d23b183ddb5fe072945a4"} Feb 24 03:00:53 crc kubenswrapper[4923]: I0224 03:00:53.720262 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7227def5-b373-488f-9f56-4b6ed170751d" path="/var/lib/kubelet/pods/7227def5-b373-488f-9f56-4b6ed170751d/volumes" Feb 24 03:00:53 crc kubenswrapper[4923]: I0224 03:00:53.870242 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8gng" event={"ID":"5d03834e-9594-476a-a7a9-1bf1aa9ade01","Type":"ContainerStarted","Data":"dc7d1567a1893ebcd6c4f4a2dfcec96da1a8133065a24c6a9b4b053a325382c0"} Feb 24 03:00:53 crc kubenswrapper[4923]: I0224 03:00:53.873341 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgg4b" event={"ID":"1ddcf8ff-1207-46ff-9dde-08219d670309","Type":"ContainerStarted","Data":"ee639c688d9cca0120941dfed643962dd9b85767fcdc04d80316b36f3de64fe1"} Feb 24 03:00:53 crc kubenswrapper[4923]: I0224 03:00:53.892975 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t8gng" podStartSLOduration=1.391580116 podStartE2EDuration="3.892957664s" podCreationTimestamp="2026-02-24 03:00:50 +0000 UTC" firstStartedPulling="2026-02-24 03:00:50.833361626 +0000 UTC m=+374.850432449" lastFinishedPulling="2026-02-24 03:00:53.334739184 +0000 UTC m=+377.351809997" observedRunningTime="2026-02-24 03:00:53.890113528 +0000 UTC m=+377.907184341" watchObservedRunningTime="2026-02-24 03:00:53.892957664 +0000 UTC m=+377.910028467" Feb 24 03:00:53 crc kubenswrapper[4923]: I0224 03:00:53.911747 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bgg4b" podStartSLOduration=2.4899303919999998 podStartE2EDuration="4.911728813s" 
podCreationTimestamp="2026-02-24 03:00:49 +0000 UTC" firstStartedPulling="2026-02-24 03:00:50.805654529 +0000 UTC m=+374.822725352" lastFinishedPulling="2026-02-24 03:00:53.22745296 +0000 UTC m=+377.244523773" observedRunningTime="2026-02-24 03:00:53.90748872 +0000 UTC m=+377.924559533" watchObservedRunningTime="2026-02-24 03:00:53.911728813 +0000 UTC m=+377.928799636" Feb 24 03:00:57 crc kubenswrapper[4923]: I0224 03:00:57.751531 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:57 crc kubenswrapper[4923]: I0224 03:00:57.751873 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:57 crc kubenswrapper[4923]: I0224 03:00:57.798935 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:57 crc kubenswrapper[4923]: I0224 03:00:57.931533 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s2s2b" Feb 24 03:00:57 crc kubenswrapper[4923]: I0224 03:00:57.962424 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:57 crc kubenswrapper[4923]: I0224 03:00:57.962543 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:58 crc kubenswrapper[4923]: I0224 03:00:58.005771 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:00:58 crc kubenswrapper[4923]: I0224 03:00:58.946898 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6c8kn" Feb 24 03:01:00 crc kubenswrapper[4923]: I0224 03:01:00.157877 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:01:00 crc kubenswrapper[4923]: I0224 03:01:00.158851 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:01:00 crc kubenswrapper[4923]: I0224 03:01:00.216179 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:01:00 crc kubenswrapper[4923]: I0224 03:01:00.370046 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:01:00 crc kubenswrapper[4923]: I0224 03:01:00.370485 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:01:00 crc kubenswrapper[4923]: I0224 03:01:00.409334 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:01:00 crc kubenswrapper[4923]: I0224 03:01:00.949793 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t8gng" Feb 24 03:01:00 crc kubenswrapper[4923]: I0224 03:01:00.957221 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bgg4b" Feb 24 03:02:19 crc kubenswrapper[4923]: I0224 03:02:19.916940 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:02:19 crc kubenswrapper[4923]: I0224 03:02:19.917449 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:02:49 crc kubenswrapper[4923]: I0224 03:02:49.916565 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:02:49 crc kubenswrapper[4923]: I0224 03:02:49.917173 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:03:19 crc kubenswrapper[4923]: I0224 03:03:19.916833 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:03:19 crc kubenswrapper[4923]: I0224 03:03:19.918839 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:03:19 crc kubenswrapper[4923]: I0224 03:03:19.919049 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 03:03:19 crc kubenswrapper[4923]: I0224 03:03:19.920020 4923 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2c4fc96d859c5f960586857b5a88ab66b5662816e6463364ed18c251990f0e2"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 03:03:19 crc kubenswrapper[4923]: I0224 03:03:19.920274 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://e2c4fc96d859c5f960586857b5a88ab66b5662816e6463364ed18c251990f0e2" gracePeriod=600 Feb 24 03:03:20 crc kubenswrapper[4923]: I0224 03:03:20.744974 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="e2c4fc96d859c5f960586857b5a88ab66b5662816e6463364ed18c251990f0e2" exitCode=0 Feb 24 03:03:20 crc kubenswrapper[4923]: I0224 03:03:20.745044 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"e2c4fc96d859c5f960586857b5a88ab66b5662816e6463364ed18c251990f0e2"} Feb 24 03:03:20 crc kubenswrapper[4923]: I0224 03:03:20.745318 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"dd9566916d0707d0d74e9c129e61b45fcf01bae77c86a9c663fcebb809b372a3"} Feb 24 03:03:20 crc kubenswrapper[4923]: I0224 03:03:20.745339 4923 scope.go:117] "RemoveContainer" containerID="92a19cc205b64e61b6c65d1f93e5df48760062306a031253913e5f685cebe0c6" Feb 24 03:03:39 crc kubenswrapper[4923]: I0224 03:03:39.845898 4923 scope.go:117] "RemoveContainer" containerID="398dacdccef17fa0cc364a118005253b3f1d85e6bf83e1a9af616e5aa4937b03" Feb 24 
03:03:39 crc kubenswrapper[4923]: I0224 03:03:39.874165 4923 scope.go:117] "RemoveContainer" containerID="41abb1a86e2ea27a699e3de57c2edc784a3ecc77c3bd6cce545ba4c1ecc6230c" Feb 24 03:03:39 crc kubenswrapper[4923]: I0224 03:03:39.906172 4923 scope.go:117] "RemoveContainer" containerID="243cc7fd0c396edcdffac2275a32713f681c13e48e5631e722100ede792e7cd2" Feb 24 03:05:49 crc kubenswrapper[4923]: I0224 03:05:49.916945 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:05:49 crc kubenswrapper[4923]: I0224 03:05:49.917624 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.407899 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj"] Feb 24 03:06:19 crc kubenswrapper[4923]: E0224 03:06:19.408626 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7227def5-b373-488f-9f56-4b6ed170751d" containerName="registry" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.408638 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="7227def5-b373-488f-9f56-4b6ed170751d" containerName="registry" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.408722 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="7227def5-b373-488f-9f56-4b6ed170751d" containerName="registry" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.409111 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.413276 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.413421 4923 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-ll9bg" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.423858 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-r9qzx"] Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.423952 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.424574 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.427523 4923 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-9pzps" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.433117 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj"] Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.444342 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-pbnww"] Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.445067 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-pbnww" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.446424 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s6sv\" (UniqueName: \"kubernetes.io/projected/b6864f21-fdba-416a-b777-a492c9c9e66c-kube-api-access-6s6sv\") pod \"cert-manager-cainjector-cf98fcc89-6qbrj\" (UID: \"b6864f21-fdba-416a-b777-a492c9c9e66c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.447351 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-r9qzx"] Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.447630 4923 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8cr5t" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.465595 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-pbnww"] Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.547763 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sk85\" (UniqueName: \"kubernetes.io/projected/60be919e-8301-45ed-9e67-6e54e6ddef7f-kube-api-access-4sk85\") pod \"cert-manager-webhook-687f57d79b-r9qzx\" (UID: \"60be919e-8301-45ed-9e67-6e54e6ddef7f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.547899 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crhqw\" (UniqueName: \"kubernetes.io/projected/779174d3-c69e-46be-b5e4-4210a6697e7b-kube-api-access-crhqw\") pod \"cert-manager-858654f9db-pbnww\" (UID: \"779174d3-c69e-46be-b5e4-4210a6697e7b\") " pod="cert-manager/cert-manager-858654f9db-pbnww" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.547998 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s6sv\" (UniqueName: \"kubernetes.io/projected/b6864f21-fdba-416a-b777-a492c9c9e66c-kube-api-access-6s6sv\") pod \"cert-manager-cainjector-cf98fcc89-6qbrj\" (UID: \"b6864f21-fdba-416a-b777-a492c9c9e66c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.568002 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s6sv\" (UniqueName: \"kubernetes.io/projected/b6864f21-fdba-416a-b777-a492c9c9e66c-kube-api-access-6s6sv\") pod \"cert-manager-cainjector-cf98fcc89-6qbrj\" (UID: \"b6864f21-fdba-416a-b777-a492c9c9e66c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.649498 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sk85\" (UniqueName: \"kubernetes.io/projected/60be919e-8301-45ed-9e67-6e54e6ddef7f-kube-api-access-4sk85\") pod \"cert-manager-webhook-687f57d79b-r9qzx\" (UID: \"60be919e-8301-45ed-9e67-6e54e6ddef7f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.649547 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crhqw\" (UniqueName: \"kubernetes.io/projected/779174d3-c69e-46be-b5e4-4210a6697e7b-kube-api-access-crhqw\") pod \"cert-manager-858654f9db-pbnww\" (UID: \"779174d3-c69e-46be-b5e4-4210a6697e7b\") " pod="cert-manager/cert-manager-858654f9db-pbnww" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.666363 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sk85\" (UniqueName: \"kubernetes.io/projected/60be919e-8301-45ed-9e67-6e54e6ddef7f-kube-api-access-4sk85\") pod \"cert-manager-webhook-687f57d79b-r9qzx\" (UID: \"60be919e-8301-45ed-9e67-6e54e6ddef7f\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.669341 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crhqw\" (UniqueName: \"kubernetes.io/projected/779174d3-c69e-46be-b5e4-4210a6697e7b-kube-api-access-crhqw\") pod \"cert-manager-858654f9db-pbnww\" (UID: \"779174d3-c69e-46be-b5e4-4210a6697e7b\") " pod="cert-manager/cert-manager-858654f9db-pbnww" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.724635 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.748504 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.761008 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-pbnww" Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.916594 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:06:19 crc kubenswrapper[4923]: I0224 03:06:19.916937 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:06:20 crc kubenswrapper[4923]: I0224 03:06:20.146215 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj"] Feb 24 03:06:20 crc 
kubenswrapper[4923]: I0224 03:06:20.153220 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 03:06:20 crc kubenswrapper[4923]: I0224 03:06:20.203823 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-r9qzx"] Feb 24 03:06:20 crc kubenswrapper[4923]: I0224 03:06:20.206652 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-pbnww"] Feb 24 03:06:20 crc kubenswrapper[4923]: W0224 03:06:20.209808 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60be919e_8301_45ed_9e67_6e54e6ddef7f.slice/crio-10f401e99f9b4edc3ef0d7f0418a9b12d5e37e34842c60cd183fab8096531941 WatchSource:0}: Error finding container 10f401e99f9b4edc3ef0d7f0418a9b12d5e37e34842c60cd183fab8096531941: Status 404 returned error can't find the container with id 10f401e99f9b4edc3ef0d7f0418a9b12d5e37e34842c60cd183fab8096531941 Feb 24 03:06:20 crc kubenswrapper[4923]: W0224 03:06:20.211131 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod779174d3_c69e_46be_b5e4_4210a6697e7b.slice/crio-531fc626a73d4b4be428cdae98a69f49c04ccb30e858d9e908fa99f35f6dbb62 WatchSource:0}: Error finding container 531fc626a73d4b4be428cdae98a69f49c04ccb30e858d9e908fa99f35f6dbb62: Status 404 returned error can't find the container with id 531fc626a73d4b4be428cdae98a69f49c04ccb30e858d9e908fa99f35f6dbb62 Feb 24 03:06:20 crc kubenswrapper[4923]: I0224 03:06:20.938187 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx" event={"ID":"60be919e-8301-45ed-9e67-6e54e6ddef7f","Type":"ContainerStarted","Data":"10f401e99f9b4edc3ef0d7f0418a9b12d5e37e34842c60cd183fab8096531941"} Feb 24 03:06:20 crc kubenswrapper[4923]: I0224 03:06:20.939253 4923 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj" event={"ID":"b6864f21-fdba-416a-b777-a492c9c9e66c","Type":"ContainerStarted","Data":"f95ff17b8f31e72047ae420ffe22d6bf02895853d3284eb8287d6672e1b823f2"} Feb 24 03:06:20 crc kubenswrapper[4923]: I0224 03:06:20.940486 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-pbnww" event={"ID":"779174d3-c69e-46be-b5e4-4210a6697e7b","Type":"ContainerStarted","Data":"531fc626a73d4b4be428cdae98a69f49c04ccb30e858d9e908fa99f35f6dbb62"} Feb 24 03:06:24 crc kubenswrapper[4923]: I0224 03:06:24.964893 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx" event={"ID":"60be919e-8301-45ed-9e67-6e54e6ddef7f","Type":"ContainerStarted","Data":"2636489a6a49bfa08feb93aa87c2353de928272197c48fbb2578b9077e166dee"} Feb 24 03:06:24 crc kubenswrapper[4923]: I0224 03:06:24.965498 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx" Feb 24 03:06:24 crc kubenswrapper[4923]: I0224 03:06:24.967632 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj" event={"ID":"b6864f21-fdba-416a-b777-a492c9c9e66c","Type":"ContainerStarted","Data":"0bcc82672b9187ba758d9a21bdc176e5db5dd0f9f887760037193432dd3329fa"} Feb 24 03:06:24 crc kubenswrapper[4923]: I0224 03:06:24.970392 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-pbnww" event={"ID":"779174d3-c69e-46be-b5e4-4210a6697e7b","Type":"ContainerStarted","Data":"7631467f5a43620c1c47555ea81a4567f77a1afdc6de1a3304433f64c3fd48b4"} Feb 24 03:06:24 crc kubenswrapper[4923]: I0224 03:06:24.993615 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx" podStartSLOduration=2.600158712 podStartE2EDuration="5.993581316s" 
podCreationTimestamp="2026-02-24 03:06:19 +0000 UTC" firstStartedPulling="2026-02-24 03:06:20.212262904 +0000 UTC m=+704.229333717" lastFinishedPulling="2026-02-24 03:06:23.605685518 +0000 UTC m=+707.622756321" observedRunningTime="2026-02-24 03:06:24.98848482 +0000 UTC m=+709.005555683" watchObservedRunningTime="2026-02-24 03:06:24.993581316 +0000 UTC m=+709.010652169" Feb 24 03:06:25 crc kubenswrapper[4923]: I0224 03:06:25.016206 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-pbnww" podStartSLOduration=2.668141258 podStartE2EDuration="6.01618057s" podCreationTimestamp="2026-02-24 03:06:19 +0000 UTC" firstStartedPulling="2026-02-24 03:06:20.214865283 +0000 UTC m=+704.231936096" lastFinishedPulling="2026-02-24 03:06:23.562904595 +0000 UTC m=+707.579975408" observedRunningTime="2026-02-24 03:06:25.009351987 +0000 UTC m=+709.026422840" watchObservedRunningTime="2026-02-24 03:06:25.01618057 +0000 UTC m=+709.033251423" Feb 24 03:06:25 crc kubenswrapper[4923]: I0224 03:06:25.047001 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6qbrj" podStartSLOduration=2.250997224 podStartE2EDuration="6.046962222s" podCreationTimestamp="2026-02-24 03:06:19 +0000 UTC" firstStartedPulling="2026-02-24 03:06:20.15297949 +0000 UTC m=+704.170050293" lastFinishedPulling="2026-02-24 03:06:23.948944488 +0000 UTC m=+707.966015291" observedRunningTime="2026-02-24 03:06:25.033535883 +0000 UTC m=+709.050606736" watchObservedRunningTime="2026-02-24 03:06:25.046962222 +0000 UTC m=+709.064033075" Feb 24 03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.727235 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-42sfg"] Feb 24 03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.728128 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" 
podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovn-controller" containerID="cri-o://83f7a60316410df3d5d5e554238f282a294d98d0a3801e5d4a1fd983d3a778a1" gracePeriod=30 Feb 24 03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.728587 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="sbdb" containerID="cri-o://aa26ad1aff7dedd659bb4fc9764cebae5f1aadd4c2651488e35d7342480a46d8" gracePeriod=30 Feb 24 03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.728671 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kube-rbac-proxy-node" containerID="cri-o://bfb689c9f3fdd896e5b3f0cae8dfad3202a2fdace3d62893ab7998a2e8a89c9e" gracePeriod=30 Feb 24 03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.728728 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovn-acl-logging" containerID="cri-o://96ba114882b2a7940916954fd6b959b6aea49bb56099cb8754a4acda08ee1f4a" gracePeriod=30 Feb 24 03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.728760 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="nbdb" containerID="cri-o://0762e472a6d74f5219ca62ad13723c38f97f1ba9f52157ecf228d84623eaf3d5" gracePeriod=30 Feb 24 03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.728858 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="northd" containerID="cri-o://55d1c59325bdc8422882f9054db3a2a544f0676067e2d657be7563a9d41a8e5d" gracePeriod=30 Feb 24 
03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.728696 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e8fcb85b60d31246ae99915253a8e7f54f97f112576ce0f72298af111bfd8913" gracePeriod=30
Feb 24 03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.765118 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-r9qzx"
Feb 24 03:06:29 crc kubenswrapper[4923]: I0224 03:06:29.810520 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovnkube-controller" containerID="cri-o://7504f1c88847e620e449f4f4a803c9f641abb29c95bfcb5ff4e88a7c5135f113" gracePeriod=30
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.007133 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42sfg_4607f544-e6b3-4188-9b33-c638dfb1bda4/ovn-acl-logging/0.log"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.007856 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42sfg_4607f544-e6b3-4188-9b33-c638dfb1bda4/ovn-controller/0.log"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008149 4923 generic.go:334] "Generic (PLEG): container finished" podID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerID="7504f1c88847e620e449f4f4a803c9f641abb29c95bfcb5ff4e88a7c5135f113" exitCode=0
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008173 4923 generic.go:334] "Generic (PLEG): container finished" podID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerID="aa26ad1aff7dedd659bb4fc9764cebae5f1aadd4c2651488e35d7342480a46d8" exitCode=0
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008180 4923 generic.go:334] "Generic (PLEG): container finished" podID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerID="0762e472a6d74f5219ca62ad13723c38f97f1ba9f52157ecf228d84623eaf3d5" exitCode=0
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008187 4923 generic.go:334] "Generic (PLEG): container finished" podID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerID="55d1c59325bdc8422882f9054db3a2a544f0676067e2d657be7563a9d41a8e5d" exitCode=0
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008195 4923 generic.go:334] "Generic (PLEG): container finished" podID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerID="e8fcb85b60d31246ae99915253a8e7f54f97f112576ce0f72298af111bfd8913" exitCode=0
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008203 4923 generic.go:334] "Generic (PLEG): container finished" podID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerID="bfb689c9f3fdd896e5b3f0cae8dfad3202a2fdace3d62893ab7998a2e8a89c9e" exitCode=0
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008210 4923 generic.go:334] "Generic (PLEG): container finished" podID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerID="96ba114882b2a7940916954fd6b959b6aea49bb56099cb8754a4acda08ee1f4a" exitCode=143
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008218 4923 generic.go:334] "Generic (PLEG): container finished" podID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerID="83f7a60316410df3d5d5e554238f282a294d98d0a3801e5d4a1fd983d3a778a1" exitCode=143
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008249 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"7504f1c88847e620e449f4f4a803c9f641abb29c95bfcb5ff4e88a7c5135f113"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008272 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"aa26ad1aff7dedd659bb4fc9764cebae5f1aadd4c2651488e35d7342480a46d8"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008281 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"0762e472a6d74f5219ca62ad13723c38f97f1ba9f52157ecf228d84623eaf3d5"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008289 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"55d1c59325bdc8422882f9054db3a2a544f0676067e2d657be7563a9d41a8e5d"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008312 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"e8fcb85b60d31246ae99915253a8e7f54f97f112576ce0f72298af111bfd8913"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008320 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"bfb689c9f3fdd896e5b3f0cae8dfad3202a2fdace3d62893ab7998a2e8a89c9e"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008329 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"96ba114882b2a7940916954fd6b959b6aea49bb56099cb8754a4acda08ee1f4a"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008337 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"83f7a60316410df3d5d5e554238f282a294d98d0a3801e5d4a1fd983d3a778a1"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008346 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" event={"ID":"4607f544-e6b3-4188-9b33-c638dfb1bda4","Type":"ContainerDied","Data":"65aca8e9e8534a2b1ab71856f9bba823f8c26742e718111f12b0252c55de15b6"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.008356 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65aca8e9e8534a2b1ab71856f9bba823f8c26742e718111f12b0252c55de15b6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.010762 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c74d9_062c912d-c8a5-4312-b691-dc6488667f7d/kube-multus/0.log"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.010790 4923 generic.go:334] "Generic (PLEG): container finished" podID="062c912d-c8a5-4312-b691-dc6488667f7d" containerID="6195211d6b7a3b2c1cb8715f8b02764e7dff4105d615755005c85b97f1785a08" exitCode=2
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.010806 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c74d9" event={"ID":"062c912d-c8a5-4312-b691-dc6488667f7d","Type":"ContainerDied","Data":"6195211d6b7a3b2c1cb8715f8b02764e7dff4105d615755005c85b97f1785a08"}
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.011175 4923 scope.go:117] "RemoveContainer" containerID="6195211d6b7a3b2c1cb8715f8b02764e7dff4105d615755005c85b97f1785a08"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.084403 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42sfg_4607f544-e6b3-4188-9b33-c638dfb1bda4/ovn-acl-logging/0.log"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.084900 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42sfg_4607f544-e6b3-4188-9b33-c638dfb1bda4/ovn-controller/0.log"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.085365 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.147681 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x87r6"]
Feb 24 03:06:30 crc kubenswrapper[4923]: E0224 03:06:30.147983 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovnkube-controller"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148010 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovnkube-controller"
Feb 24 03:06:30 crc kubenswrapper[4923]: E0224 03:06:30.148043 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="northd"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148058 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="northd"
Feb 24 03:06:30 crc kubenswrapper[4923]: E0224 03:06:30.148074 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovn-acl-logging"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148087 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovn-acl-logging"
Feb 24 03:06:30 crc kubenswrapper[4923]: E0224 03:06:30.148109 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovn-controller"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148121 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovn-controller"
Feb 24 03:06:30 crc kubenswrapper[4923]: E0224 03:06:30.148137 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="nbdb"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148150 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="nbdb"
Feb 24 03:06:30 crc kubenswrapper[4923]: E0224 03:06:30.148164 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kubecfg-setup"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148177 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kubecfg-setup"
Feb 24 03:06:30 crc kubenswrapper[4923]: E0224 03:06:30.148195 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="sbdb"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148207 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="sbdb"
Feb 24 03:06:30 crc kubenswrapper[4923]: E0224 03:06:30.148223 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kube-rbac-proxy-ovn-metrics"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148236 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kube-rbac-proxy-ovn-metrics"
Feb 24 03:06:30 crc kubenswrapper[4923]: E0224 03:06:30.148250 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kube-rbac-proxy-node"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148262 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kube-rbac-proxy-node"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148449 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovnkube-controller"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148470 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovn-controller"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148484 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="ovn-acl-logging"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148505 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="sbdb"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148521 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kube-rbac-proxy-node"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148538 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="northd"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148556 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="kube-rbac-proxy-ovn-metrics"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.148576 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" containerName="nbdb"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.152539 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195570 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-systemd-units\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195640 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-config\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195682 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-var-lib-openvswitch\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195708 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-etc-openvswitch\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195734 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tkvl\" (UniqueName: \"kubernetes.io/projected/4607f544-e6b3-4188-9b33-c638dfb1bda4-kube-api-access-2tkvl\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195736 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195759 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195805 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195839 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195878 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-kubelet\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195909 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-netns\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195938 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-ovn\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195960 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-log-socket\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195976 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195990 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-ovn-kubernetes\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.195995 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196007 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-log-socket" (OuterVolumeSpecName: "log-socket") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196020 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-bin\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196033 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196033 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196074 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-script-lib\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196115 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196126 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-node-log\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196138 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196159 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-slash\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196180 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-node-log" (OuterVolumeSpecName: "node-log") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196197 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-env-overrides\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196208 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-slash" (OuterVolumeSpecName: "host-slash") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196229 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-systemd\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196257 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovn-node-metrics-cert\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196307 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-openvswitch\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196333 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-netd\") pod \"4607f544-e6b3-4188-9b33-c638dfb1bda4\" (UID: \"4607f544-e6b3-4188-9b33-c638dfb1bda4\") "
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196534 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196537 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf5df20b-2e81-4fe6-8ed1-666a2434e627-env-overrides\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196565 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196578 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196591 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-var-lib-openvswitch\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196606 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196695 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-run-netns\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196727 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-run-ovn-kubernetes\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196724 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196793 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-kubelet\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196839 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-run-systemd\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196866 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-log-socket\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196895 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-cni-bin\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196951 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-run-ovn\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.196982 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-systemd-units\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197017 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-slash\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197048 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197099 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-etc-openvswitch\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197130 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgc65\" (UniqueName: \"kubernetes.io/projected/cf5df20b-2e81-4fe6-8ed1-666a2434e627-kube-api-access-kgc65\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197153 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf5df20b-2e81-4fe6-8ed1-666a2434e627-ovnkube-config\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197179 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf5df20b-2e81-4fe6-8ed1-666a2434e627-ovnkube-script-lib\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197210 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf5df20b-2e81-4fe6-8ed1-666a2434e627-ovn-node-metrics-cert\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197244 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-run-openvswitch\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197264 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-node-log\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197284 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-cni-netd\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197350 4923 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197365 4923 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-node-log\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197378 4923 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-slash\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197392 4923 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197404 4923 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197416 4923 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-netd\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197428 4923 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197439 4923 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197451 4923 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197465 4923 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197477 4923 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197490 4923 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-netns\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197502 4923 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197514 4923 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197525 4923 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-log-socket\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197538 4923 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.197551 4923 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.201589 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4607f544-e6b3-4188-9b33-c638dfb1bda4-kube-api-access-2tkvl" (OuterVolumeSpecName: "kube-api-access-2tkvl") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "kube-api-access-2tkvl".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.202792 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.209827 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4607f544-e6b3-4188-9b33-c638dfb1bda4" (UID: "4607f544-e6b3-4188-9b33-c638dfb1bda4"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298547 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-run-netns\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298563 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-run-netns\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298650 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298681 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-run-ovn-kubernetes\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298692 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-kubelet\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298741 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-run-systemd\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298769 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-log-socket\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298801 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-cni-bin\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298807 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-kubelet\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298836 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-run-ovn\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298871 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-cni-bin\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298875 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-log-socket\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298872 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-systemd-units\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298900 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-systemd-units\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298914 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-run-systemd\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298930 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-slash\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298943 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-run-ovn\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298961 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.298999 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-etc-openvswitch\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299033 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgc65\" (UniqueName: \"kubernetes.io/projected/cf5df20b-2e81-4fe6-8ed1-666a2434e627-kube-api-access-kgc65\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299059 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf5df20b-2e81-4fe6-8ed1-666a2434e627-ovnkube-config\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299064 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-etc-openvswitch\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299034 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299088 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/cf5df20b-2e81-4fe6-8ed1-666a2434e627-ovnkube-script-lib\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299091 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-slash\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299133 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf5df20b-2e81-4fe6-8ed1-666a2434e627-ovn-node-metrics-cert\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299210 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-run-openvswitch\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299240 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-cni-netd\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299259 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-node-log\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299316 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf5df20b-2e81-4fe6-8ed1-666a2434e627-env-overrides\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299354 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-var-lib-openvswitch\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299355 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-run-openvswitch\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299454 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-var-lib-openvswitch\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299478 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-node-log\") pod 
\"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299508 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf5df20b-2e81-4fe6-8ed1-666a2434e627-host-cni-netd\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299543 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tkvl\" (UniqueName: \"kubernetes.io/projected/4607f544-e6b3-4188-9b33-c638dfb1bda4-kube-api-access-2tkvl\") on node \"crc\" DevicePath \"\"" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299577 4923 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4607f544-e6b3-4188-9b33-c638dfb1bda4-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.299598 4923 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4607f544-e6b3-4188-9b33-c638dfb1bda4-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.300015 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf5df20b-2e81-4fe6-8ed1-666a2434e627-ovnkube-script-lib\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.300077 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf5df20b-2e81-4fe6-8ed1-666a2434e627-env-overrides\") pod \"ovnkube-node-x87r6\" (UID: 
\"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.300202 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf5df20b-2e81-4fe6-8ed1-666a2434e627-ovnkube-config\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.302944 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf5df20b-2e81-4fe6-8ed1-666a2434e627-ovn-node-metrics-cert\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.313796 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgc65\" (UniqueName: \"kubernetes.io/projected/cf5df20b-2e81-4fe6-8ed1-666a2434e627-kube-api-access-kgc65\") pod \"ovnkube-node-x87r6\" (UID: \"cf5df20b-2e81-4fe6-8ed1-666a2434e627\") " pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: I0224 03:06:30.469928 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:30 crc kubenswrapper[4923]: W0224 03:06:30.491349 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5df20b_2e81_4fe6_8ed1_666a2434e627.slice/crio-82bc3eaad8c00e1da29ce6908ff817dceb36a516340b9d5488f3135036702c1b WatchSource:0}: Error finding container 82bc3eaad8c00e1da29ce6908ff817dceb36a516340b9d5488f3135036702c1b: Status 404 returned error can't find the container with id 82bc3eaad8c00e1da29ce6908ff817dceb36a516340b9d5488f3135036702c1b Feb 24 03:06:31 crc kubenswrapper[4923]: I0224 03:06:31.020456 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c74d9_062c912d-c8a5-4312-b691-dc6488667f7d/kube-multus/0.log" Feb 24 03:06:31 crc kubenswrapper[4923]: I0224 03:06:31.020547 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c74d9" event={"ID":"062c912d-c8a5-4312-b691-dc6488667f7d","Type":"ContainerStarted","Data":"3b2575a16dccc8335ac3e62e6a4315c47686ba59718a39933c2730fb1ce8c216"} Feb 24 03:06:31 crc kubenswrapper[4923]: I0224 03:06:31.024690 4923 generic.go:334] "Generic (PLEG): container finished" podID="cf5df20b-2e81-4fe6-8ed1-666a2434e627" containerID="06f60d564a57fb0033c2f992eb34f117c001161e74cc8fe8debe0366d7aa2bf6" exitCode=0 Feb 24 03:06:31 crc kubenswrapper[4923]: I0224 03:06:31.024809 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42sfg" Feb 24 03:06:31 crc kubenswrapper[4923]: I0224 03:06:31.024788 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerDied","Data":"06f60d564a57fb0033c2f992eb34f117c001161e74cc8fe8debe0366d7aa2bf6"} Feb 24 03:06:31 crc kubenswrapper[4923]: I0224 03:06:31.025020 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerStarted","Data":"82bc3eaad8c00e1da29ce6908ff817dceb36a516340b9d5488f3135036702c1b"} Feb 24 03:06:31 crc kubenswrapper[4923]: I0224 03:06:31.078739 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-42sfg"] Feb 24 03:06:31 crc kubenswrapper[4923]: I0224 03:06:31.091423 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-42sfg"] Feb 24 03:06:31 crc kubenswrapper[4923]: I0224 03:06:31.719984 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4607f544-e6b3-4188-9b33-c638dfb1bda4" path="/var/lib/kubelet/pods/4607f544-e6b3-4188-9b33-c638dfb1bda4/volumes" Feb 24 03:06:32 crc kubenswrapper[4923]: I0224 03:06:32.034886 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerStarted","Data":"af5e773140d09ac8a0d255c90d3963ce47414cd849f8e471a485a31390dc4c65"} Feb 24 03:06:32 crc kubenswrapper[4923]: I0224 03:06:32.035211 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerStarted","Data":"c2e2294a34397fccd1089c1330ca2327a521652474d49f0a3bad3a664e523827"} Feb 24 03:06:32 crc kubenswrapper[4923]: I0224 
03:06:32.035234 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerStarted","Data":"a7f7b60eb8c6f6d6a27e956553104f58c6e229c17b4c15d8f1a43bf4dee2e927"} Feb 24 03:06:32 crc kubenswrapper[4923]: I0224 03:06:32.035247 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerStarted","Data":"cd0b981a0a7f4457c2cffd94c99ba54d45fded72eed84fcb6219c28d97004b97"} Feb 24 03:06:32 crc kubenswrapper[4923]: I0224 03:06:32.035259 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerStarted","Data":"b32353a59a6327f23ff203cb4fe9e6179d4bfe6f3481386e4a8890fcb812e153"} Feb 24 03:06:32 crc kubenswrapper[4923]: I0224 03:06:32.035272 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerStarted","Data":"50a28636449601852381b25fd2b20f3ca07bfeba0c6f1597d9e0b3ef4f14d591"} Feb 24 03:06:35 crc kubenswrapper[4923]: I0224 03:06:35.057617 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerStarted","Data":"48bf11521ed7ca008cfcf728be9d5ab7e249b0ac82ff16e5048fae8479e9f3ad"} Feb 24 03:06:37 crc kubenswrapper[4923]: I0224 03:06:37.103178 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" event={"ID":"cf5df20b-2e81-4fe6-8ed1-666a2434e627","Type":"ContainerStarted","Data":"ff8accca7673bc6ac011cc98cf1fb9d4b0d464008c74a40c71b915b37527f50d"} Feb 24 03:06:37 crc kubenswrapper[4923]: I0224 03:06:37.104088 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:37 crc kubenswrapper[4923]: I0224 03:06:37.104127 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:37 crc kubenswrapper[4923]: I0224 03:06:37.137456 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:37 crc kubenswrapper[4923]: I0224 03:06:37.147523 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" podStartSLOduration=7.147500635 podStartE2EDuration="7.147500635s" podCreationTimestamp="2026-02-24 03:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:06:37.142956381 +0000 UTC m=+721.160027214" watchObservedRunningTime="2026-02-24 03:06:37.147500635 +0000 UTC m=+721.164571458" Feb 24 03:06:38 crc kubenswrapper[4923]: I0224 03:06:38.109879 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:38 crc kubenswrapper[4923]: I0224 03:06:38.140969 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6" Feb 24 03:06:39 crc kubenswrapper[4923]: I0224 03:06:39.974600 4923 scope.go:117] "RemoveContainer" containerID="e8fcb85b60d31246ae99915253a8e7f54f97f112576ce0f72298af111bfd8913" Feb 24 03:06:39 crc kubenswrapper[4923]: I0224 03:06:39.998181 4923 scope.go:117] "RemoveContainer" containerID="83f7a60316410df3d5d5e554238f282a294d98d0a3801e5d4a1fd983d3a778a1" Feb 24 03:06:40 crc kubenswrapper[4923]: I0224 03:06:40.022629 4923 scope.go:117] "RemoveContainer" containerID="7504f1c88847e620e449f4f4a803c9f641abb29c95bfcb5ff4e88a7c5135f113" Feb 24 03:06:40 crc kubenswrapper[4923]: I0224 03:06:40.048180 4923 scope.go:117] 
"RemoveContainer" containerID="0762e472a6d74f5219ca62ad13723c38f97f1ba9f52157ecf228d84623eaf3d5"
Feb 24 03:06:40 crc kubenswrapper[4923]: I0224 03:06:40.073887 4923 scope.go:117] "RemoveContainer" containerID="aa26ad1aff7dedd659bb4fc9764cebae5f1aadd4c2651488e35d7342480a46d8"
Feb 24 03:06:40 crc kubenswrapper[4923]: I0224 03:06:40.086434 4923 scope.go:117] "RemoveContainer" containerID="4e6c422888a838b09f87f0610515692a5e75561f0c1b398cd083abdde442ed77"
Feb 24 03:06:40 crc kubenswrapper[4923]: I0224 03:06:40.101579 4923 scope.go:117] "RemoveContainer" containerID="bfb689c9f3fdd896e5b3f0cae8dfad3202a2fdace3d62893ab7998a2e8a89c9e"
Feb 24 03:06:40 crc kubenswrapper[4923]: I0224 03:06:40.123977 4923 scope.go:117] "RemoveContainer" containerID="96ba114882b2a7940916954fd6b959b6aea49bb56099cb8754a4acda08ee1f4a"
Feb 24 03:06:40 crc kubenswrapper[4923]: I0224 03:06:40.139943 4923 scope.go:117] "RemoveContainer" containerID="55d1c59325bdc8422882f9054db3a2a544f0676067e2d657be7563a9d41a8e5d"
Feb 24 03:06:47 crc kubenswrapper[4923]: I0224 03:06:47.867006 4923 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 24 03:06:49 crc kubenswrapper[4923]: I0224 03:06:49.916405 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 03:06:49 crc kubenswrapper[4923]: I0224 03:06:49.916737 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 03:06:49 crc kubenswrapper[4923]: I0224 03:06:49.916783 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 03:06:49 crc kubenswrapper[4923]: I0224 03:06:49.917280 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd9566916d0707d0d74e9c129e61b45fcf01bae77c86a9c663fcebb809b372a3"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 03:06:49 crc kubenswrapper[4923]: I0224 03:06:49.917369 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://dd9566916d0707d0d74e9c129e61b45fcf01bae77c86a9c663fcebb809b372a3" gracePeriod=600
Feb 24 03:06:50 crc kubenswrapper[4923]: I0224 03:06:50.207253 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="dd9566916d0707d0d74e9c129e61b45fcf01bae77c86a9c663fcebb809b372a3" exitCode=0
Feb 24 03:06:50 crc kubenswrapper[4923]: I0224 03:06:50.207825 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"dd9566916d0707d0d74e9c129e61b45fcf01bae77c86a9c663fcebb809b372a3"}
Feb 24 03:06:50 crc kubenswrapper[4923]: I0224 03:06:50.207921 4923 scope.go:117] "RemoveContainer" containerID="e2c4fc96d859c5f960586857b5a88ab66b5662816e6463364ed18c251990f0e2"
Feb 24 03:06:51 crc kubenswrapper[4923]: I0224 03:06:51.216120 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"369652d4e2fcdce7839d154f1d90c85b55a365ec3b7c320fea7e81e6fe472c3d"}
Feb 24 03:07:00 crc kubenswrapper[4923]: I0224 03:07:00.545253 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x87r6"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.241608 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"]
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.243289 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.244847 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.249455 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"]
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.337367 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdk2\" (UniqueName: \"kubernetes.io/projected/01359d36-26ee-45ab-83f1-2cc5a8f360be-kube-api-access-szdk2\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.337427 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.337579 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.439421 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szdk2\" (UniqueName: \"kubernetes.io/projected/01359d36-26ee-45ab-83f1-2cc5a8f360be-kube-api-access-szdk2\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.439477 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.439507 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.439874 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.440258 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.520453 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdk2\" (UniqueName: \"kubernetes.io/projected/01359d36-26ee-45ab-83f1-2cc5a8f360be-kube-api-access-szdk2\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.556905 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:07 crc kubenswrapper[4923]: I0224 03:07:07.950940 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"]
Feb 24 03:07:08 crc kubenswrapper[4923]: I0224 03:07:08.319407 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr" event={"ID":"01359d36-26ee-45ab-83f1-2cc5a8f360be","Type":"ContainerStarted","Data":"6b3c8ed108af09eca59582ef0f85d7e1c2198850ab721ca6e01137cb34619c00"}
Feb 24 03:07:08 crc kubenswrapper[4923]: I0224 03:07:08.319813 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr" event={"ID":"01359d36-26ee-45ab-83f1-2cc5a8f360be","Type":"ContainerStarted","Data":"3235dc788abd7cc124f6c2183d8d264a2f060646939445b4c6c80f4272f8e823"}
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.329589 4923 generic.go:334] "Generic (PLEG): container finished" podID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerID="6b3c8ed108af09eca59582ef0f85d7e1c2198850ab721ca6e01137cb34619c00" exitCode=0
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.329669 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr" event={"ID":"01359d36-26ee-45ab-83f1-2cc5a8f360be","Type":"ContainerDied","Data":"6b3c8ed108af09eca59582ef0f85d7e1c2198850ab721ca6e01137cb34619c00"}
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.611866 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ckz4f"]
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.614279 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.619098 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ckz4f"]
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.766958 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-catalog-content\") pod \"redhat-operators-ckz4f\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") " pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.767025 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf8qp\" (UniqueName: \"kubernetes.io/projected/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-kube-api-access-sf8qp\") pod \"redhat-operators-ckz4f\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") " pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.767076 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-utilities\") pod \"redhat-operators-ckz4f\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") " pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.868366 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-utilities\") pod \"redhat-operators-ckz4f\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") " pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.868521 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-catalog-content\") pod \"redhat-operators-ckz4f\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") " pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.869004 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-utilities\") pod \"redhat-operators-ckz4f\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") " pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.869049 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-catalog-content\") pod \"redhat-operators-ckz4f\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") " pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.869219 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf8qp\" (UniqueName: \"kubernetes.io/projected/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-kube-api-access-sf8qp\") pod \"redhat-operators-ckz4f\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") " pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.891798 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf8qp\" (UniqueName: \"kubernetes.io/projected/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-kube-api-access-sf8qp\") pod \"redhat-operators-ckz4f\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") " pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:09 crc kubenswrapper[4923]: I0224 03:07:09.947633 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:10 crc kubenswrapper[4923]: I0224 03:07:10.135798 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ckz4f"]
Feb 24 03:07:10 crc kubenswrapper[4923]: W0224 03:07:10.141184 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd25eab_ecb4_4c07_9da3_500645a2f2c3.slice/crio-13112f4b1d179c8aff00dae11224de7e02b464e4d50bb34306e9f2488866fec5 WatchSource:0}: Error finding container 13112f4b1d179c8aff00dae11224de7e02b464e4d50bb34306e9f2488866fec5: Status 404 returned error can't find the container with id 13112f4b1d179c8aff00dae11224de7e02b464e4d50bb34306e9f2488866fec5
Feb 24 03:07:10 crc kubenswrapper[4923]: I0224 03:07:10.335601 4923 generic.go:334] "Generic (PLEG): container finished" podID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerID="b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106" exitCode=0
Feb 24 03:07:10 crc kubenswrapper[4923]: I0224 03:07:10.335646 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckz4f" event={"ID":"7bd25eab-ecb4-4c07-9da3-500645a2f2c3","Type":"ContainerDied","Data":"b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106"}
Feb 24 03:07:10 crc kubenswrapper[4923]: I0224 03:07:10.335869 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckz4f" event={"ID":"7bd25eab-ecb4-4c07-9da3-500645a2f2c3","Type":"ContainerStarted","Data":"13112f4b1d179c8aff00dae11224de7e02b464e4d50bb34306e9f2488866fec5"}
Feb 24 03:07:11 crc kubenswrapper[4923]: I0224 03:07:11.345483 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckz4f" event={"ID":"7bd25eab-ecb4-4c07-9da3-500645a2f2c3","Type":"ContainerStarted","Data":"61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac"}
Feb 24 03:07:11 crc kubenswrapper[4923]: I0224 03:07:11.348461 4923 generic.go:334] "Generic (PLEG): container finished" podID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerID="6df1e37bf43a65e2936dd42217640cc5ee01caccc21b68cafa468226b492c104" exitCode=0
Feb 24 03:07:11 crc kubenswrapper[4923]: I0224 03:07:11.348541 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr" event={"ID":"01359d36-26ee-45ab-83f1-2cc5a8f360be","Type":"ContainerDied","Data":"6df1e37bf43a65e2936dd42217640cc5ee01caccc21b68cafa468226b492c104"}
Feb 24 03:07:12 crc kubenswrapper[4923]: I0224 03:07:12.357857 4923 generic.go:334] "Generic (PLEG): container finished" podID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerID="92146d3c98c4e6d21b0310cbdfefbf09ffebcdf211e90337144c94a234265790" exitCode=0
Feb 24 03:07:12 crc kubenswrapper[4923]: I0224 03:07:12.357951 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr" event={"ID":"01359d36-26ee-45ab-83f1-2cc5a8f360be","Type":"ContainerDied","Data":"92146d3c98c4e6d21b0310cbdfefbf09ffebcdf211e90337144c94a234265790"}
Feb 24 03:07:12 crc kubenswrapper[4923]: I0224 03:07:12.361523 4923 generic.go:334] "Generic (PLEG): container finished" podID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerID="61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac" exitCode=0
Feb 24 03:07:12 crc kubenswrapper[4923]: I0224 03:07:12.361612 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckz4f" event={"ID":"7bd25eab-ecb4-4c07-9da3-500645a2f2c3","Type":"ContainerDied","Data":"61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac"}
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.371776 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckz4f" event={"ID":"7bd25eab-ecb4-4c07-9da3-500645a2f2c3","Type":"ContainerStarted","Data":"5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646"}
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.398805 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ckz4f" podStartSLOduration=1.812028644 podStartE2EDuration="4.398778263s" podCreationTimestamp="2026-02-24 03:07:09 +0000 UTC" firstStartedPulling="2026-02-24 03:07:10.337203729 +0000 UTC m=+754.354274562" lastFinishedPulling="2026-02-24 03:07:12.923953328 +0000 UTC m=+756.941024181" observedRunningTime="2026-02-24 03:07:13.39665438 +0000 UTC m=+757.413725223" watchObservedRunningTime="2026-02-24 03:07:13.398778263 +0000 UTC m=+757.415849106"
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.646449 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.840904 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-bundle\") pod \"01359d36-26ee-45ab-83f1-2cc5a8f360be\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") "
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.840967 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-util\") pod \"01359d36-26ee-45ab-83f1-2cc5a8f360be\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") "
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.841030 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szdk2\" (UniqueName: \"kubernetes.io/projected/01359d36-26ee-45ab-83f1-2cc5a8f360be-kube-api-access-szdk2\") pod \"01359d36-26ee-45ab-83f1-2cc5a8f360be\" (UID: \"01359d36-26ee-45ab-83f1-2cc5a8f360be\") "
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.843400 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-bundle" (OuterVolumeSpecName: "bundle") pod "01359d36-26ee-45ab-83f1-2cc5a8f360be" (UID: "01359d36-26ee-45ab-83f1-2cc5a8f360be"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.849545 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01359d36-26ee-45ab-83f1-2cc5a8f360be-kube-api-access-szdk2" (OuterVolumeSpecName: "kube-api-access-szdk2") pod "01359d36-26ee-45ab-83f1-2cc5a8f360be" (UID: "01359d36-26ee-45ab-83f1-2cc5a8f360be"). InnerVolumeSpecName "kube-api-access-szdk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.942863 4923 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 03:07:13 crc kubenswrapper[4923]: I0224 03:07:13.943486 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szdk2\" (UniqueName: \"kubernetes.io/projected/01359d36-26ee-45ab-83f1-2cc5a8f360be-kube-api-access-szdk2\") on node \"crc\" DevicePath \"\""
Feb 24 03:07:14 crc kubenswrapper[4923]: I0224 03:07:14.079065 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-util" (OuterVolumeSpecName: "util") pod "01359d36-26ee-45ab-83f1-2cc5a8f360be" (UID: "01359d36-26ee-45ab-83f1-2cc5a8f360be"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:07:14 crc kubenswrapper[4923]: I0224 03:07:14.146939 4923 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01359d36-26ee-45ab-83f1-2cc5a8f360be-util\") on node \"crc\" DevicePath \"\""
Feb 24 03:07:14 crc kubenswrapper[4923]: I0224 03:07:14.378140 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr"
Feb 24 03:07:14 crc kubenswrapper[4923]: I0224 03:07:14.378160 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr" event={"ID":"01359d36-26ee-45ab-83f1-2cc5a8f360be","Type":"ContainerDied","Data":"3235dc788abd7cc124f6c2183d8d264a2f060646939445b4c6c80f4272f8e823"}
Feb 24 03:07:14 crc kubenswrapper[4923]: I0224 03:07:14.378218 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3235dc788abd7cc124f6c2183d8d264a2f060646939445b4c6c80f4272f8e823"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.144640 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-k582j"]
Feb 24 03:07:17 crc kubenswrapper[4923]: E0224 03:07:17.145180 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerName="util"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.145196 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerName="util"
Feb 24 03:07:17 crc kubenswrapper[4923]: E0224 03:07:17.145212 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerName="extract"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.145219 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerName="extract"
Feb 24 03:07:17 crc kubenswrapper[4923]: E0224 03:07:17.145234 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerName="pull"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.145241 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerName="pull"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.145349 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="01359d36-26ee-45ab-83f1-2cc5a8f360be" containerName="extract"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.145705 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-k582j"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.147904 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.147981 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.148487 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-l8dx6"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.187864 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-k582j"]
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.189160 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b767\" (UniqueName: \"kubernetes.io/projected/0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd-kube-api-access-4b767\") pod \"nmstate-operator-694c9596b7-k582j\" (UID: \"0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-k582j"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.290371 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b767\" (UniqueName: \"kubernetes.io/projected/0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd-kube-api-access-4b767\") pod \"nmstate-operator-694c9596b7-k582j\" (UID: \"0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-k582j"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.308909 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b767\" (UniqueName: \"kubernetes.io/projected/0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd-kube-api-access-4b767\") pod \"nmstate-operator-694c9596b7-k582j\" (UID: \"0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-k582j"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.459883 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-k582j"
Feb 24 03:07:17 crc kubenswrapper[4923]: I0224 03:07:17.665204 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-k582j"]
Feb 24 03:07:17 crc kubenswrapper[4923]: W0224 03:07:17.674582 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bbb266f_3eb2_4f05_8d57_9c6ac88c83fd.slice/crio-3aa3e7f854aa8ffaecd13107520b909c48c137a3f5947cbbca2fb2453aaa164d WatchSource:0}: Error finding container 3aa3e7f854aa8ffaecd13107520b909c48c137a3f5947cbbca2fb2453aaa164d: Status 404 returned error can't find the container with id 3aa3e7f854aa8ffaecd13107520b909c48c137a3f5947cbbca2fb2453aaa164d
Feb 24 03:07:18 crc kubenswrapper[4923]: I0224 03:07:18.404631 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-k582j" event={"ID":"0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd","Type":"ContainerStarted","Data":"3aa3e7f854aa8ffaecd13107520b909c48c137a3f5947cbbca2fb2453aaa164d"}
Feb 24 03:07:19 crc kubenswrapper[4923]: I0224 03:07:19.947815 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:19 crc kubenswrapper[4923]: I0224 03:07:19.948192 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:20 crc kubenswrapper[4923]: I0224 03:07:20.004252 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:20 crc kubenswrapper[4923]: I0224 03:07:20.420620 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-k582j" event={"ID":"0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd","Type":"ContainerStarted","Data":"54dd5be6ddfb7fad9278b449635655294b180303be4d024da6d9edabb3801e59"}
Feb 24 03:07:20 crc kubenswrapper[4923]: I0224 03:07:20.447765 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-k582j" podStartSLOduration=1.016915575 podStartE2EDuration="3.447746324s" podCreationTimestamp="2026-02-24 03:07:17 +0000 UTC" firstStartedPulling="2026-02-24 03:07:17.676009809 +0000 UTC m=+761.693080622" lastFinishedPulling="2026-02-24 03:07:20.106840558 +0000 UTC m=+764.123911371" observedRunningTime="2026-02-24 03:07:20.442736118 +0000 UTC m=+764.459806961" watchObservedRunningTime="2026-02-24 03:07:20.447746324 +0000 UTC m=+764.464817137"
Feb 24 03:07:20 crc kubenswrapper[4923]: I0224 03:07:20.460080 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:22 crc kubenswrapper[4923]: I0224 03:07:22.401185 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ckz4f"]
Feb 24 03:07:22 crc kubenswrapper[4923]: I0224 03:07:22.434020 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ckz4f" podUID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerName="registry-server" containerID="cri-o://5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646" gracePeriod=2
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.346990 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.442490 4923 generic.go:334] "Generic (PLEG): container finished" podID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerID="5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646" exitCode=0
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.442558 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckz4f" event={"ID":"7bd25eab-ecb4-4c07-9da3-500645a2f2c3","Type":"ContainerDied","Data":"5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646"}
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.442615 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckz4f" event={"ID":"7bd25eab-ecb4-4c07-9da3-500645a2f2c3","Type":"ContainerDied","Data":"13112f4b1d179c8aff00dae11224de7e02b464e4d50bb34306e9f2488866fec5"}
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.442646 4923 scope.go:117] "RemoveContainer" containerID="5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.442728 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ckz4f"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.463719 4923 scope.go:117] "RemoveContainer" containerID="61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.487982 4923 scope.go:117] "RemoveContainer" containerID="b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.488680 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf8qp\" (UniqueName: \"kubernetes.io/projected/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-kube-api-access-sf8qp\") pod \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") "
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.488737 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-utilities\") pod \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") "
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.488847 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-catalog-content\") pod \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\" (UID: \"7bd25eab-ecb4-4c07-9da3-500645a2f2c3\") "
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.490604 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-utilities" (OuterVolumeSpecName: "utilities") pod "7bd25eab-ecb4-4c07-9da3-500645a2f2c3" (UID: "7bd25eab-ecb4-4c07-9da3-500645a2f2c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.498629 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-kube-api-access-sf8qp" (OuterVolumeSpecName: "kube-api-access-sf8qp") pod "7bd25eab-ecb4-4c07-9da3-500645a2f2c3" (UID: "7bd25eab-ecb4-4c07-9da3-500645a2f2c3"). InnerVolumeSpecName "kube-api-access-sf8qp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.547524 4923 scope.go:117] "RemoveContainer" containerID="5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646"
Feb 24 03:07:23 crc kubenswrapper[4923]: E0224 03:07:23.548182 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646\": container with ID starting with 5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646 not found: ID does not exist" containerID="5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.548218 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646"} err="failed to get container status \"5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646\": rpc error: code = NotFound desc = could not find container \"5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646\": container with ID starting with 5b74c945a42f0e4783961b7b6ebf194b9de30f41a8052ab03bf70b329cded646 not found: ID does not exist"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.548244 4923 scope.go:117] "RemoveContainer" containerID="61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac"
Feb 24 03:07:23 crc kubenswrapper[4923]: E0224 03:07:23.549027 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac\": container with ID starting with 61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac not found: ID does not exist" containerID="61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.549063 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac"} err="failed to get container status \"61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac\": rpc error: code = NotFound desc = could not find container \"61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac\": container with ID starting with 61665d12c9b9f6ef6ea697f3808c874e04757137eee623b78b94b6c8d66a33ac not found: ID does not exist"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.549083 4923 scope.go:117] "RemoveContainer" containerID="b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106"
Feb 24 03:07:23 crc kubenswrapper[4923]: E0224 03:07:23.549595 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106\": container with ID starting with b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106 not found: ID does not exist" containerID="b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.549683 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106"} err="failed to get container status \"b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106\": rpc error: code = NotFound desc = could not find container \"b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106\": container with ID starting with b92c57da76c98e5a3e1212a6552281e70cdee57a6d113ba1f6e5a21f2ae7c106 not found: ID does not exist"
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.590445 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf8qp\" (UniqueName: \"kubernetes.io/projected/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-kube-api-access-sf8qp\") on node \"crc\" DevicePath \"\""
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.590488 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.644110 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bd25eab-ecb4-4c07-9da3-500645a2f2c3" (UID: "7bd25eab-ecb4-4c07-9da3-500645a2f2c3"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.691700 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bd25eab-ecb4-4c07-9da3-500645a2f2c3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.766906 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ckz4f"] Feb 24 03:07:23 crc kubenswrapper[4923]: I0224 03:07:23.773764 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ckz4f"] Feb 24 03:07:25 crc kubenswrapper[4923]: I0224 03:07:25.724025 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" path="/var/lib/kubelet/pods/7bd25eab-ecb4-4c07-9da3-500645a2f2c3/volumes" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.720937 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-mnljj"] Feb 24 03:07:27 crc kubenswrapper[4923]: E0224 03:07:27.721170 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerName="extract-content" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.721184 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerName="extract-content" Feb 24 03:07:27 crc kubenswrapper[4923]: E0224 03:07:27.721199 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerName="registry-server" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.721208 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerName="registry-server" Feb 24 03:07:27 crc kubenswrapper[4923]: E0224 03:07:27.721227 4923 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerName="extract-utilities" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.721235 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerName="extract-utilities" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.721367 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd25eab-ecb4-4c07-9da3-500645a2f2c3" containerName="registry-server" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.722041 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mnljj" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.725851 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-mnljj"] Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.728711 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5"] Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.729369 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.761688 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.761858 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-6nbwv" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.772361 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5"] Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.784393 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fp5bb"] Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.785248 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.787924 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pslj2\" (UniqueName: \"kubernetes.io/projected/8192c6db-fbcf-45b6-b43c-313abcc10d2e-kube-api-access-pslj2\") pod \"nmstate-webhook-866bcb46dc-25qp5\" (UID: \"8192c6db-fbcf-45b6-b43c-313abcc10d2e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.788010 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8192c6db-fbcf-45b6-b43c-313abcc10d2e-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-25qp5\" (UID: \"8192c6db-fbcf-45b6-b43c-313abcc10d2e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.788042 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kc4l7\" (UniqueName: \"kubernetes.io/projected/ae5580e9-8a55-4dbe-99c8-e21e22d3813e-kube-api-access-kc4l7\") pod \"nmstate-metrics-58c85c668d-mnljj\" (UID: \"ae5580e9-8a55-4dbe-99c8-e21e22d3813e\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-mnljj" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.867624 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5"] Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.868401 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.873822 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.873893 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.877969 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5"] Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.879130 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xfvs8" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889631 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/36d48cde-8247-4729-aa2d-d6b99d25b198-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-bxdm5\" (UID: \"36d48cde-8247-4729-aa2d-d6b99d25b198\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889684 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/36d48cde-8247-4729-aa2d-d6b99d25b198-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bxdm5\" (UID: \"36d48cde-8247-4729-aa2d-d6b99d25b198\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889708 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pslj2\" (UniqueName: \"kubernetes.io/projected/8192c6db-fbcf-45b6-b43c-313abcc10d2e-kube-api-access-pslj2\") pod \"nmstate-webhook-866bcb46dc-25qp5\" (UID: \"8192c6db-fbcf-45b6-b43c-313abcc10d2e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889731 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/886309d8-6744-4d32-a729-225ef9679579-nmstate-lock\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889760 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmjbn\" (UniqueName: \"kubernetes.io/projected/36d48cde-8247-4729-aa2d-d6b99d25b198-kube-api-access-gmjbn\") pod \"nmstate-console-plugin-5c78fc5d65-bxdm5\" (UID: \"36d48cde-8247-4729-aa2d-d6b99d25b198\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889782 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/886309d8-6744-4d32-a729-225ef9679579-ovs-socket\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889799 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8192c6db-fbcf-45b6-b43c-313abcc10d2e-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-25qp5\" (UID: \"8192c6db-fbcf-45b6-b43c-313abcc10d2e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889820 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc4l7\" (UniqueName: \"kubernetes.io/projected/ae5580e9-8a55-4dbe-99c8-e21e22d3813e-kube-api-access-kc4l7\") pod \"nmstate-metrics-58c85c668d-mnljj\" (UID: \"ae5580e9-8a55-4dbe-99c8-e21e22d3813e\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-mnljj" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889838 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zwq2\" (UniqueName: \"kubernetes.io/projected/886309d8-6744-4d32-a729-225ef9679579-kube-api-access-8zwq2\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.889862 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/886309d8-6744-4d32-a729-225ef9679579-dbus-socket\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: E0224 03:07:27.890185 4923 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 24 03:07:27 crc kubenswrapper[4923]: E0224 03:07:27.890229 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8192c6db-fbcf-45b6-b43c-313abcc10d2e-tls-key-pair podName:8192c6db-fbcf-45b6-b43c-313abcc10d2e 
nodeName:}" failed. No retries permitted until 2026-02-24 03:07:28.390215923 +0000 UTC m=+772.407286736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/8192c6db-fbcf-45b6-b43c-313abcc10d2e-tls-key-pair") pod "nmstate-webhook-866bcb46dc-25qp5" (UID: "8192c6db-fbcf-45b6-b43c-313abcc10d2e") : secret "openshift-nmstate-webhook" not found Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.911511 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pslj2\" (UniqueName: \"kubernetes.io/projected/8192c6db-fbcf-45b6-b43c-313abcc10d2e-kube-api-access-pslj2\") pod \"nmstate-webhook-866bcb46dc-25qp5\" (UID: \"8192c6db-fbcf-45b6-b43c-313abcc10d2e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.912887 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc4l7\" (UniqueName: \"kubernetes.io/projected/ae5580e9-8a55-4dbe-99c8-e21e22d3813e-kube-api-access-kc4l7\") pod \"nmstate-metrics-58c85c668d-mnljj\" (UID: \"ae5580e9-8a55-4dbe-99c8-e21e22d3813e\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-mnljj" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.990216 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/886309d8-6744-4d32-a729-225ef9679579-dbus-socket\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.990259 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/36d48cde-8247-4729-aa2d-d6b99d25b198-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-bxdm5\" (UID: \"36d48cde-8247-4729-aa2d-d6b99d25b198\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.990307 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/36d48cde-8247-4729-aa2d-d6b99d25b198-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bxdm5\" (UID: \"36d48cde-8247-4729-aa2d-d6b99d25b198\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.990333 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/886309d8-6744-4d32-a729-225ef9679579-nmstate-lock\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.990362 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmjbn\" (UniqueName: \"kubernetes.io/projected/36d48cde-8247-4729-aa2d-d6b99d25b198-kube-api-access-gmjbn\") pod \"nmstate-console-plugin-5c78fc5d65-bxdm5\" (UID: \"36d48cde-8247-4729-aa2d-d6b99d25b198\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.990382 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/886309d8-6744-4d32-a729-225ef9679579-ovs-socket\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.990411 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zwq2\" (UniqueName: \"kubernetes.io/projected/886309d8-6744-4d32-a729-225ef9679579-kube-api-access-8zwq2\") pod \"nmstate-handler-fp5bb\" (UID: 
\"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.990912 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/886309d8-6744-4d32-a729-225ef9679579-dbus-socket\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.990952 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/886309d8-6744-4d32-a729-225ef9679579-nmstate-lock\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.991171 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/886309d8-6744-4d32-a729-225ef9679579-ovs-socket\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.991702 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/36d48cde-8247-4729-aa2d-d6b99d25b198-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-bxdm5\" (UID: \"36d48cde-8247-4729-aa2d-d6b99d25b198\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:27 crc kubenswrapper[4923]: I0224 03:07:27.994901 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/36d48cde-8247-4729-aa2d-d6b99d25b198-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-bxdm5\" (UID: \"36d48cde-8247-4729-aa2d-d6b99d25b198\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.023109 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmjbn\" (UniqueName: \"kubernetes.io/projected/36d48cde-8247-4729-aa2d-d6b99d25b198-kube-api-access-gmjbn\") pod \"nmstate-console-plugin-5c78fc5d65-bxdm5\" (UID: \"36d48cde-8247-4729-aa2d-d6b99d25b198\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.023657 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zwq2\" (UniqueName: \"kubernetes.io/projected/886309d8-6744-4d32-a729-225ef9679579-kube-api-access-8zwq2\") pod \"nmstate-handler-fp5bb\" (UID: \"886309d8-6744-4d32-a729-225ef9679579\") " pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.061364 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d9c8895d5-4n6k5"] Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.062217 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.079288 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d9c8895d5-4n6k5"] Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.089066 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mnljj" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.114833 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:28 crc kubenswrapper[4923]: W0224 03:07:28.140128 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886309d8_6744_4d32_a729_225ef9679579.slice/crio-d6b370076f83a60eeb70b31a51385de4db58d286763748469d1d612956aba45b WatchSource:0}: Error finding container d6b370076f83a60eeb70b31a51385de4db58d286763748469d1d612956aba45b: Status 404 returned error can't find the container with id d6b370076f83a60eeb70b31a51385de4db58d286763748469d1d612956aba45b Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.180029 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.194622 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-console-config\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.194701 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8rzv\" (UniqueName: \"kubernetes.io/projected/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-kube-api-access-q8rzv\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.194733 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-console-oauth-config\") pod \"console-6d9c8895d5-4n6k5\" (UID: 
\"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.194754 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-trusted-ca-bundle\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.194771 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-service-ca\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.194791 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-console-serving-cert\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.194817 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-oauth-serving-cert\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.295714 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-console-serving-cert\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.295753 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-oauth-serving-cert\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.295778 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-console-config\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.295868 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8rzv\" (UniqueName: \"kubernetes.io/projected/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-kube-api-access-q8rzv\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.296554 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-console-config\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.296605 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-console-oauth-config\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.296629 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-trusted-ca-bundle\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.296648 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-service-ca\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.296706 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-oauth-serving-cert\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.297225 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-service-ca\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.298042 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-trusted-ca-bundle\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.301628 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-console-serving-cert\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.302919 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-console-oauth-config\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.311167 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8rzv\" (UniqueName: \"kubernetes.io/projected/07833ade-07a1-4d62-ba3d-bd0f7e9ca31b-kube-api-access-q8rzv\") pod \"console-6d9c8895d5-4n6k5\" (UID: \"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b\") " pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.351344 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5"] Feb 24 03:07:28 crc kubenswrapper[4923]: W0224 03:07:28.356209 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d48cde_8247_4729_aa2d_d6b99d25b198.slice/crio-b413f3b4f9a063a91fce0bedd1e23c7217be4a4b0196191a1732ad294f292634 WatchSource:0}: Error finding container b413f3b4f9a063a91fce0bedd1e23c7217be4a4b0196191a1732ad294f292634: Status 404 
returned error can't find the container with id b413f3b4f9a063a91fce0bedd1e23c7217be4a4b0196191a1732ad294f292634 Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.376480 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.397420 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8192c6db-fbcf-45b6-b43c-313abcc10d2e-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-25qp5\" (UID: \"8192c6db-fbcf-45b6-b43c-313abcc10d2e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.401108 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8192c6db-fbcf-45b6-b43c-313abcc10d2e-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-25qp5\" (UID: \"8192c6db-fbcf-45b6-b43c-313abcc10d2e\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.403654 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.479246 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" event={"ID":"36d48cde-8247-4729-aa2d-d6b99d25b198","Type":"ContainerStarted","Data":"b413f3b4f9a063a91fce0bedd1e23c7217be4a4b0196191a1732ad294f292634"} Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.480174 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fp5bb" event={"ID":"886309d8-6744-4d32-a729-225ef9679579","Type":"ContainerStarted","Data":"d6b370076f83a60eeb70b31a51385de4db58d286763748469d1d612956aba45b"} Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.484617 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-mnljj"] Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.779126 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d9c8895d5-4n6k5"] Feb 24 03:07:28 crc kubenswrapper[4923]: W0224 03:07:28.784392 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07833ade_07a1_4d62_ba3d_bd0f7e9ca31b.slice/crio-f01942b5e340e1febae8af3a076b0674742ac7906b8721d664f2f0de865dfc25 WatchSource:0}: Error finding container f01942b5e340e1febae8af3a076b0674742ac7906b8721d664f2f0de865dfc25: Status 404 returned error can't find the container with id f01942b5e340e1febae8af3a076b0674742ac7906b8721d664f2f0de865dfc25 Feb 24 03:07:28 crc kubenswrapper[4923]: I0224 03:07:28.833508 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5"] Feb 24 03:07:28 crc kubenswrapper[4923]: W0224 03:07:28.839858 4923 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8192c6db_fbcf_45b6_b43c_313abcc10d2e.slice/crio-119b06a104e604f2446c7700a49453a9224ce48afeaca4c321df2854780cf24b WatchSource:0}: Error finding container 119b06a104e604f2446c7700a49453a9224ce48afeaca4c321df2854780cf24b: Status 404 returned error can't find the container with id 119b06a104e604f2446c7700a49453a9224ce48afeaca4c321df2854780cf24b Feb 24 03:07:29 crc kubenswrapper[4923]: I0224 03:07:29.490261 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9c8895d5-4n6k5" event={"ID":"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b","Type":"ContainerStarted","Data":"c4a020a91c31216ee7deca49b033714464e8b381b131325c4af151fa2d8ea3f4"} Feb 24 03:07:29 crc kubenswrapper[4923]: I0224 03:07:29.491486 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9c8895d5-4n6k5" event={"ID":"07833ade-07a1-4d62-ba3d-bd0f7e9ca31b","Type":"ContainerStarted","Data":"f01942b5e340e1febae8af3a076b0674742ac7906b8721d664f2f0de865dfc25"} Feb 24 03:07:29 crc kubenswrapper[4923]: I0224 03:07:29.500003 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" event={"ID":"8192c6db-fbcf-45b6-b43c-313abcc10d2e","Type":"ContainerStarted","Data":"119b06a104e604f2446c7700a49453a9224ce48afeaca4c321df2854780cf24b"} Feb 24 03:07:29 crc kubenswrapper[4923]: I0224 03:07:29.502092 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mnljj" event={"ID":"ae5580e9-8a55-4dbe-99c8-e21e22d3813e","Type":"ContainerStarted","Data":"cbf7db4cc1eb4a344f920a799aea9dde86a1bc685dc41bf30518973160aa4429"} Feb 24 03:07:29 crc kubenswrapper[4923]: I0224 03:07:29.512835 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d9c8895d5-4n6k5" podStartSLOduration=1.512817961 podStartE2EDuration="1.512817961s" podCreationTimestamp="2026-02-24 03:07:28 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:07:29.509451367 +0000 UTC m=+773.526522180" watchObservedRunningTime="2026-02-24 03:07:29.512817961 +0000 UTC m=+773.529888774" Feb 24 03:07:31 crc kubenswrapper[4923]: I0224 03:07:31.516010 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" event={"ID":"8192c6db-fbcf-45b6-b43c-313abcc10d2e","Type":"ContainerStarted","Data":"6f4bd91190dd2eaff2620a25f0f0cc3056bc9689b7661205de893f2d8accf74f"} Feb 24 03:07:31 crc kubenswrapper[4923]: I0224 03:07:31.517934 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:31 crc kubenswrapper[4923]: I0224 03:07:31.519865 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mnljj" event={"ID":"ae5580e9-8a55-4dbe-99c8-e21e22d3813e","Type":"ContainerStarted","Data":"3019c2c245247c3ce76ad77b8684a7b90c87bd4f7d73780c8a038ed4628593b5"} Feb 24 03:07:31 crc kubenswrapper[4923]: I0224 03:07:31.521711 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fp5bb" event={"ID":"886309d8-6744-4d32-a729-225ef9679579","Type":"ContainerStarted","Data":"69a239ee20cddc622a1994e9e2c07c9b137f5e89109d2ea91ee152b9ab59b92e"} Feb 24 03:07:31 crc kubenswrapper[4923]: I0224 03:07:31.522254 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:31 crc kubenswrapper[4923]: I0224 03:07:31.523945 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" event={"ID":"36d48cde-8247-4729-aa2d-d6b99d25b198","Type":"ContainerStarted","Data":"ebd24e04ba5d7f1dd381da5094d244bca19f44387bd4a92245c7920af6742f96"} Feb 24 03:07:31 crc kubenswrapper[4923]: I0224 
03:07:31.539518 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" podStartSLOduration=2.349561508 podStartE2EDuration="4.539498734s" podCreationTimestamp="2026-02-24 03:07:27 +0000 UTC" firstStartedPulling="2026-02-24 03:07:28.842067617 +0000 UTC m=+772.859138430" lastFinishedPulling="2026-02-24 03:07:31.032004843 +0000 UTC m=+775.049075656" observedRunningTime="2026-02-24 03:07:31.536725785 +0000 UTC m=+775.553796608" watchObservedRunningTime="2026-02-24 03:07:31.539498734 +0000 UTC m=+775.556569557" Feb 24 03:07:31 crc kubenswrapper[4923]: I0224 03:07:31.563404 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-bxdm5" podStartSLOduration=1.90060939 podStartE2EDuration="4.563377301s" podCreationTimestamp="2026-02-24 03:07:27 +0000 UTC" firstStartedPulling="2026-02-24 03:07:28.358244357 +0000 UTC m=+772.375315170" lastFinishedPulling="2026-02-24 03:07:31.021012228 +0000 UTC m=+775.038083081" observedRunningTime="2026-02-24 03:07:31.560288734 +0000 UTC m=+775.577359547" watchObservedRunningTime="2026-02-24 03:07:31.563377301 +0000 UTC m=+775.580448114" Feb 24 03:07:31 crc kubenswrapper[4923]: I0224 03:07:31.579061 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fp5bb" podStartSLOduration=1.701225725 podStartE2EDuration="4.579040253s" podCreationTimestamp="2026-02-24 03:07:27 +0000 UTC" firstStartedPulling="2026-02-24 03:07:28.142267076 +0000 UTC m=+772.159337889" lastFinishedPulling="2026-02-24 03:07:31.020081604 +0000 UTC m=+775.037152417" observedRunningTime="2026-02-24 03:07:31.574434658 +0000 UTC m=+775.591505471" watchObservedRunningTime="2026-02-24 03:07:31.579040253 +0000 UTC m=+775.596111066" Feb 24 03:07:33 crc kubenswrapper[4923]: I0224 03:07:33.538600 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-58c85c668d-mnljj" event={"ID":"ae5580e9-8a55-4dbe-99c8-e21e22d3813e","Type":"ContainerStarted","Data":"5dbc3c3c55bebb04bb6bb73b8cc801b78ece4b6817cdfee4e9fce9d94b9fd5f7"} Feb 24 03:07:33 crc kubenswrapper[4923]: I0224 03:07:33.563700 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-mnljj" podStartSLOduration=1.678167219 podStartE2EDuration="6.563677425s" podCreationTimestamp="2026-02-24 03:07:27 +0000 UTC" firstStartedPulling="2026-02-24 03:07:28.497978782 +0000 UTC m=+772.515049595" lastFinishedPulling="2026-02-24 03:07:33.383488988 +0000 UTC m=+777.400559801" observedRunningTime="2026-02-24 03:07:33.561162562 +0000 UTC m=+777.578233385" watchObservedRunningTime="2026-02-24 03:07:33.563677425 +0000 UTC m=+777.580748248" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.145350 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-fp5bb" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.157078 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c6df6"] Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.160347 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.188373 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c6df6"] Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.228819 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbqxr\" (UniqueName: \"kubernetes.io/projected/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-kube-api-access-qbqxr\") pod \"community-operators-c6df6\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.228881 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-catalog-content\") pod \"community-operators-c6df6\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.228905 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-utilities\") pod \"community-operators-c6df6\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.330089 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-catalog-content\") pod \"community-operators-c6df6\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.330136 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-utilities\") pod \"community-operators-c6df6\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.330225 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbqxr\" (UniqueName: \"kubernetes.io/projected/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-kube-api-access-qbqxr\") pod \"community-operators-c6df6\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.330859 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-catalog-content\") pod \"community-operators-c6df6\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.331062 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-utilities\") pod \"community-operators-c6df6\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.351665 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbqxr\" (UniqueName: \"kubernetes.io/projected/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-kube-api-access-qbqxr\") pod \"community-operators-c6df6\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.376576 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.376763 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.383321 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.519860 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.592026 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d9c8895d5-4n6k5" Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.653349 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-w5k6j"] Feb 24 03:07:38 crc kubenswrapper[4923]: I0224 03:07:38.882806 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c6df6"] Feb 24 03:07:39 crc kubenswrapper[4923]: I0224 03:07:39.591809 4923 generic.go:334] "Generic (PLEG): container finished" podID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerID="d63fc6ea364b50d66d37162e62942b53cb7bab769e099b84d13d930a1736beba" exitCode=0 Feb 24 03:07:39 crc kubenswrapper[4923]: I0224 03:07:39.591878 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6df6" event={"ID":"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38","Type":"ContainerDied","Data":"d63fc6ea364b50d66d37162e62942b53cb7bab769e099b84d13d930a1736beba"} Feb 24 03:07:39 crc kubenswrapper[4923]: I0224 03:07:39.591939 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6df6" 
event={"ID":"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38","Type":"ContainerStarted","Data":"da31d66eff543d53bff90707cb35978411c2382abf3bf8888452a7dd262df7e8"} Feb 24 03:07:40 crc kubenswrapper[4923]: I0224 03:07:40.598184 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6df6" event={"ID":"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38","Type":"ContainerStarted","Data":"bda636fba5dc4e82c5072c2499c1c5799401a557d11f746a59519b1cd63cce71"} Feb 24 03:07:41 crc kubenswrapper[4923]: I0224 03:07:41.607396 4923 generic.go:334] "Generic (PLEG): container finished" podID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerID="bda636fba5dc4e82c5072c2499c1c5799401a557d11f746a59519b1cd63cce71" exitCode=0 Feb 24 03:07:41 crc kubenswrapper[4923]: I0224 03:07:41.607461 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6df6" event={"ID":"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38","Type":"ContainerDied","Data":"bda636fba5dc4e82c5072c2499c1c5799401a557d11f746a59519b1cd63cce71"} Feb 24 03:07:42 crc kubenswrapper[4923]: I0224 03:07:42.618370 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6df6" event={"ID":"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38","Type":"ContainerStarted","Data":"5131905ec70ffd9c6835477b1371472eccad685adff6a608aaf8d5a993bba213"} Feb 24 03:07:42 crc kubenswrapper[4923]: I0224 03:07:42.651447 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c6df6" podStartSLOduration=2.19213775 podStartE2EDuration="4.651426632s" podCreationTimestamp="2026-02-24 03:07:38 +0000 UTC" firstStartedPulling="2026-02-24 03:07:39.593820317 +0000 UTC m=+783.610891170" lastFinishedPulling="2026-02-24 03:07:42.053109199 +0000 UTC m=+786.070180052" observedRunningTime="2026-02-24 03:07:42.645953415 +0000 UTC m=+786.663024248" watchObservedRunningTime="2026-02-24 03:07:42.651426632 +0000 UTC 
m=+786.668497455" Feb 24 03:07:48 crc kubenswrapper[4923]: I0224 03:07:48.413100 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-25qp5" Feb 24 03:07:48 crc kubenswrapper[4923]: I0224 03:07:48.521243 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:48 crc kubenswrapper[4923]: I0224 03:07:48.521627 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:48 crc kubenswrapper[4923]: I0224 03:07:48.572284 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:48 crc kubenswrapper[4923]: I0224 03:07:48.731986 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:48 crc kubenswrapper[4923]: I0224 03:07:48.805492 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c6df6"] Feb 24 03:07:50 crc kubenswrapper[4923]: I0224 03:07:50.671044 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c6df6" podUID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerName="registry-server" containerID="cri-o://5131905ec70ffd9c6835477b1371472eccad685adff6a608aaf8d5a993bba213" gracePeriod=2 Feb 24 03:07:51 crc kubenswrapper[4923]: I0224 03:07:51.677279 4923 generic.go:334] "Generic (PLEG): container finished" podID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerID="5131905ec70ffd9c6835477b1371472eccad685adff6a608aaf8d5a993bba213" exitCode=0 Feb 24 03:07:51 crc kubenswrapper[4923]: I0224 03:07:51.677341 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6df6" 
event={"ID":"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38","Type":"ContainerDied","Data":"5131905ec70ffd9c6835477b1371472eccad685adff6a608aaf8d5a993bba213"} Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.686737 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6df6" event={"ID":"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38","Type":"ContainerDied","Data":"da31d66eff543d53bff90707cb35978411c2382abf3bf8888452a7dd262df7e8"} Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.686786 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da31d66eff543d53bff90707cb35978411c2382abf3bf8888452a7dd262df7e8" Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.701270 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.873026 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-utilities\") pod \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.873147 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbqxr\" (UniqueName: \"kubernetes.io/projected/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-kube-api-access-qbqxr\") pod \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.873181 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-catalog-content\") pod \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\" (UID: \"6dfb3992-e58e-4ebe-9b37-4bb9f1763a38\") " Feb 24 03:07:52 crc kubenswrapper[4923]: 
I0224 03:07:52.874451 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-utilities" (OuterVolumeSpecName: "utilities") pod "6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" (UID: "6dfb3992-e58e-4ebe-9b37-4bb9f1763a38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.881755 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-kube-api-access-qbqxr" (OuterVolumeSpecName: "kube-api-access-qbqxr") pod "6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" (UID: "6dfb3992-e58e-4ebe-9b37-4bb9f1763a38"). InnerVolumeSpecName "kube-api-access-qbqxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.940421 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" (UID: "6dfb3992-e58e-4ebe-9b37-4bb9f1763a38"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.975741 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbqxr\" (UniqueName: \"kubernetes.io/projected/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-kube-api-access-qbqxr\") on node \"crc\" DevicePath \"\"" Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.975875 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:07:52 crc kubenswrapper[4923]: I0224 03:07:52.975907 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:07:53 crc kubenswrapper[4923]: I0224 03:07:53.693483 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c6df6" Feb 24 03:07:53 crc kubenswrapper[4923]: I0224 03:07:53.746778 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c6df6"] Feb 24 03:07:53 crc kubenswrapper[4923]: I0224 03:07:53.752634 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c6df6"] Feb 24 03:07:55 crc kubenswrapper[4923]: I0224 03:07:55.724719 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" path="/var/lib/kubelet/pods/6dfb3992-e58e-4ebe-9b37-4bb9f1763a38/volumes" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.847704 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5"] Feb 24 03:08:01 crc kubenswrapper[4923]: E0224 03:08:01.848784 4923 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerName="registry-server" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.848807 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerName="registry-server" Feb 24 03:08:01 crc kubenswrapper[4923]: E0224 03:08:01.848829 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerName="extract-content" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.848842 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerName="extract-content" Feb 24 03:08:01 crc kubenswrapper[4923]: E0224 03:08:01.848868 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerName="extract-utilities" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.848880 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerName="extract-utilities" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.849075 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfb3992-e58e-4ebe-9b37-4bb9f1763a38" containerName="registry-server" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.850499 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.852649 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.861439 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5"] Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.903503 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.903588 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zsp7\" (UniqueName: \"kubernetes.io/projected/69c2b6f1-7455-4c00-a61e-43ab85b9df97-kube-api-access-2zsp7\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:01 crc kubenswrapper[4923]: I0224 03:08:01.903631 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:02 crc kubenswrapper[4923]: 
I0224 03:08:02.004452 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.004631 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.004690 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zsp7\" (UniqueName: \"kubernetes.io/projected/69c2b6f1-7455-4c00-a61e-43ab85b9df97-kube-api-access-2zsp7\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.005287 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.005348 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.030001 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zsp7\" (UniqueName: \"kubernetes.io/projected/69c2b6f1-7455-4c00-a61e-43ab85b9df97-kube-api-access-2zsp7\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.169753 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.373358 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5"] Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.751039 4923 generic.go:334] "Generic (PLEG): container finished" podID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerID="d48968a41e98d5af21edeb5127573fe8ff5bc0b0b142c4be018b608b6b55be63" exitCode=0 Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.751110 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" event={"ID":"69c2b6f1-7455-4c00-a61e-43ab85b9df97","Type":"ContainerDied","Data":"d48968a41e98d5af21edeb5127573fe8ff5bc0b0b142c4be018b608b6b55be63"} Feb 24 03:08:02 crc kubenswrapper[4923]: I0224 03:08:02.751154 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" event={"ID":"69c2b6f1-7455-4c00-a61e-43ab85b9df97","Type":"ContainerStarted","Data":"f220dd8a34cf69a499cd75cb20465590a2bff4faba469df4f9ae94f906857e23"} Feb 24 03:08:03 crc kubenswrapper[4923]: I0224 03:08:03.689097 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-w5k6j" podUID="3f89d640-5e7f-473b-98e3-420780c10024" containerName="console" containerID="cri-o://66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36" gracePeriod=15 Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.151380 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-w5k6j_3f89d640-5e7f-473b-98e3-420780c10024/console/0.log" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.151460 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.233365 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-service-ca\") pod \"3f89d640-5e7f-473b-98e3-420780c10024\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.233770 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-oauth-config\") pod \"3f89d640-5e7f-473b-98e3-420780c10024\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.234124 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-serving-cert\") pod 
\"3f89d640-5e7f-473b-98e3-420780c10024\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.234361 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/3f89d640-5e7f-473b-98e3-420780c10024-kube-api-access-cdtgs\") pod \"3f89d640-5e7f-473b-98e3-420780c10024\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.234408 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-trusted-ca-bundle\") pod \"3f89d640-5e7f-473b-98e3-420780c10024\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.234443 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-console-config\") pod \"3f89d640-5e7f-473b-98e3-420780c10024\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.234482 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-oauth-serving-cert\") pod \"3f89d640-5e7f-473b-98e3-420780c10024\" (UID: \"3f89d640-5e7f-473b-98e3-420780c10024\") " Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.234109 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-service-ca" (OuterVolumeSpecName: "service-ca") pod "3f89d640-5e7f-473b-98e3-420780c10024" (UID: "3f89d640-5e7f-473b-98e3-420780c10024"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.234743 4923 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.235151 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3f89d640-5e7f-473b-98e3-420780c10024" (UID: "3f89d640-5e7f-473b-98e3-420780c10024"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.235326 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-console-config" (OuterVolumeSpecName: "console-config") pod "3f89d640-5e7f-473b-98e3-420780c10024" (UID: "3f89d640-5e7f-473b-98e3-420780c10024"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.235363 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3f89d640-5e7f-473b-98e3-420780c10024" (UID: "3f89d640-5e7f-473b-98e3-420780c10024"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.240218 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f89d640-5e7f-473b-98e3-420780c10024-kube-api-access-cdtgs" (OuterVolumeSpecName: "kube-api-access-cdtgs") pod "3f89d640-5e7f-473b-98e3-420780c10024" (UID: "3f89d640-5e7f-473b-98e3-420780c10024"). InnerVolumeSpecName "kube-api-access-cdtgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.240250 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3f89d640-5e7f-473b-98e3-420780c10024" (UID: "3f89d640-5e7f-473b-98e3-420780c10024"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.240495 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3f89d640-5e7f-473b-98e3-420780c10024" (UID: "3f89d640-5e7f-473b-98e3-420780c10024"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.335879 4923 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-console-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.336089 4923 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.336213 4923 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.336360 4923 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f89d640-5e7f-473b-98e3-420780c10024-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.336499 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdtgs\" (UniqueName: \"kubernetes.io/projected/3f89d640-5e7f-473b-98e3-420780c10024-kube-api-access-cdtgs\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.336581 4923 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f89d640-5e7f-473b-98e3-420780c10024-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.772193 4923 generic.go:334] "Generic (PLEG): container finished" podID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerID="1dd5c2a90d39f2791e0110dceea99b628f8f53404b1ee9a8ec7018bb047ffdc5" exitCode=0 Feb 24 03:08:04 crc 
kubenswrapper[4923]: I0224 03:08:04.772361 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" event={"ID":"69c2b6f1-7455-4c00-a61e-43ab85b9df97","Type":"ContainerDied","Data":"1dd5c2a90d39f2791e0110dceea99b628f8f53404b1ee9a8ec7018bb047ffdc5"} Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.779047 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-w5k6j_3f89d640-5e7f-473b-98e3-420780c10024/console/0.log" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.779166 4923 generic.go:334] "Generic (PLEG): container finished" podID="3f89d640-5e7f-473b-98e3-420780c10024" containerID="66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36" exitCode=2 Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.779208 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-w5k6j" event={"ID":"3f89d640-5e7f-473b-98e3-420780c10024","Type":"ContainerDied","Data":"66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36"} Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.779244 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-w5k6j" event={"ID":"3f89d640-5e7f-473b-98e3-420780c10024","Type":"ContainerDied","Data":"97b420f6958520ae51fb9f650c82142cf47dc5d5114f60d98b53ae09964f4031"} Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.779278 4923 scope.go:117] "RemoveContainer" containerID="66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.779490 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-w5k6j" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.819265 4923 scope.go:117] "RemoveContainer" containerID="66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36" Feb 24 03:08:04 crc kubenswrapper[4923]: E0224 03:08:04.819863 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36\": container with ID starting with 66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36 not found: ID does not exist" containerID="66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.819927 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36"} err="failed to get container status \"66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36\": rpc error: code = NotFound desc = could not find container \"66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36\": container with ID starting with 66bfd904fad9ad79d2916e20c33d6162f4ff403887d264fdaedd1a18493f4a36 not found: ID does not exist" Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.885337 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-w5k6j"] Feb 24 03:08:04 crc kubenswrapper[4923]: I0224 03:08:04.889269 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-w5k6j"] Feb 24 03:08:05 crc kubenswrapper[4923]: I0224 03:08:05.035656 4923 patch_prober.go:28] interesting pod/console-f9d7485db-w5k6j container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" start-of-body= Feb 24 03:08:05 crc kubenswrapper[4923]: I0224 03:08:05.035795 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-w5k6j" podUID="3f89d640-5e7f-473b-98e3-420780c10024" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 03:08:05 crc kubenswrapper[4923]: I0224 03:08:05.723808 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f89d640-5e7f-473b-98e3-420780c10024" path="/var/lib/kubelet/pods/3f89d640-5e7f-473b-98e3-420780c10024/volumes" Feb 24 03:08:05 crc kubenswrapper[4923]: I0224 03:08:05.787857 4923 generic.go:334] "Generic (PLEG): container finished" podID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerID="507eec31d12b66a8676699673e09a0c89d30826ff2f0078c3ae27cdb43e5b675" exitCode=0 Feb 24 03:08:05 crc kubenswrapper[4923]: I0224 03:08:05.787963 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" event={"ID":"69c2b6f1-7455-4c00-a61e-43ab85b9df97","Type":"ContainerDied","Data":"507eec31d12b66a8676699673e09a0c89d30826ff2f0078c3ae27cdb43e5b675"} Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.064852 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.171521 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zsp7\" (UniqueName: \"kubernetes.io/projected/69c2b6f1-7455-4c00-a61e-43ab85b9df97-kube-api-access-2zsp7\") pod \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.171973 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-bundle\") pod \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.172015 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-util\") pod \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\" (UID: \"69c2b6f1-7455-4c00-a61e-43ab85b9df97\") " Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.173152 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-bundle" (OuterVolumeSpecName: "bundle") pod "69c2b6f1-7455-4c00-a61e-43ab85b9df97" (UID: "69c2b6f1-7455-4c00-a61e-43ab85b9df97"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.178492 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c2b6f1-7455-4c00-a61e-43ab85b9df97-kube-api-access-2zsp7" (OuterVolumeSpecName: "kube-api-access-2zsp7") pod "69c2b6f1-7455-4c00-a61e-43ab85b9df97" (UID: "69c2b6f1-7455-4c00-a61e-43ab85b9df97"). InnerVolumeSpecName "kube-api-access-2zsp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.201132 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-util" (OuterVolumeSpecName: "util") pod "69c2b6f1-7455-4c00-a61e-43ab85b9df97" (UID: "69c2b6f1-7455-4c00-a61e-43ab85b9df97"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.273346 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zsp7\" (UniqueName: \"kubernetes.io/projected/69c2b6f1-7455-4c00-a61e-43ab85b9df97-kube-api-access-2zsp7\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.273382 4923 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.273397 4923 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69c2b6f1-7455-4c00-a61e-43ab85b9df97-util\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.808691 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" event={"ID":"69c2b6f1-7455-4c00-a61e-43ab85b9df97","Type":"ContainerDied","Data":"f220dd8a34cf69a499cd75cb20465590a2bff4faba469df4f9ae94f906857e23"} Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.808771 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f220dd8a34cf69a499cd75cb20465590a2bff4faba469df4f9ae94f906857e23" Feb 24 03:08:07 crc kubenswrapper[4923]: I0224 03:08:07.808776 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.488212 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq"] Feb 24 03:08:18 crc kubenswrapper[4923]: E0224 03:08:18.488841 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f89d640-5e7f-473b-98e3-420780c10024" containerName="console" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.488853 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f89d640-5e7f-473b-98e3-420780c10024" containerName="console" Feb 24 03:08:18 crc kubenswrapper[4923]: E0224 03:08:18.488867 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerName="pull" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.488872 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerName="pull" Feb 24 03:08:18 crc kubenswrapper[4923]: E0224 03:08:18.488888 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerName="extract" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.488895 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerName="extract" Feb 24 03:08:18 crc kubenswrapper[4923]: E0224 03:08:18.488910 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerName="util" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.488918 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerName="util" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.489014 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f89d640-5e7f-473b-98e3-420780c10024" 
containerName="console" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.489040 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c2b6f1-7455-4c00-a61e-43ab85b9df97" containerName="extract" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.489423 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.497154 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.497415 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.497434 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-m7d2t" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.497569 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.497647 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.510776 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq"] Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.544516 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/475ff3bc-195d-4768-892d-1c0274b3a25c-apiservice-cert\") pod \"metallb-operator-controller-manager-84dbcb4757-pvzfq\" (UID: \"475ff3bc-195d-4768-892d-1c0274b3a25c\") " pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 
24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.544559 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn82b\" (UniqueName: \"kubernetes.io/projected/475ff3bc-195d-4768-892d-1c0274b3a25c-kube-api-access-fn82b\") pod \"metallb-operator-controller-manager-84dbcb4757-pvzfq\" (UID: \"475ff3bc-195d-4768-892d-1c0274b3a25c\") " pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.544643 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/475ff3bc-195d-4768-892d-1c0274b3a25c-webhook-cert\") pod \"metallb-operator-controller-manager-84dbcb4757-pvzfq\" (UID: \"475ff3bc-195d-4768-892d-1c0274b3a25c\") " pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.645827 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/475ff3bc-195d-4768-892d-1c0274b3a25c-webhook-cert\") pod \"metallb-operator-controller-manager-84dbcb4757-pvzfq\" (UID: \"475ff3bc-195d-4768-892d-1c0274b3a25c\") " pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.645900 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/475ff3bc-195d-4768-892d-1c0274b3a25c-apiservice-cert\") pod \"metallb-operator-controller-manager-84dbcb4757-pvzfq\" (UID: \"475ff3bc-195d-4768-892d-1c0274b3a25c\") " pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.645924 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn82b\" (UniqueName: 
\"kubernetes.io/projected/475ff3bc-195d-4768-892d-1c0274b3a25c-kube-api-access-fn82b\") pod \"metallb-operator-controller-manager-84dbcb4757-pvzfq\" (UID: \"475ff3bc-195d-4768-892d-1c0274b3a25c\") " pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.651982 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/475ff3bc-195d-4768-892d-1c0274b3a25c-apiservice-cert\") pod \"metallb-operator-controller-manager-84dbcb4757-pvzfq\" (UID: \"475ff3bc-195d-4768-892d-1c0274b3a25c\") " pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.662153 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn82b\" (UniqueName: \"kubernetes.io/projected/475ff3bc-195d-4768-892d-1c0274b3a25c-kube-api-access-fn82b\") pod \"metallb-operator-controller-manager-84dbcb4757-pvzfq\" (UID: \"475ff3bc-195d-4768-892d-1c0274b3a25c\") " pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.665720 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/475ff3bc-195d-4768-892d-1c0274b3a25c-webhook-cert\") pod \"metallb-operator-controller-manager-84dbcb4757-pvzfq\" (UID: \"475ff3bc-195d-4768-892d-1c0274b3a25c\") " pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.731653 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79448499bc-9ssng"] Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.732689 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.738409 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.738813 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6grtw" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.743544 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.752519 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79448499bc-9ssng"] Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.752557 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fcd559c-f2fb-455a-9cc5-6be1cd6be98a-webhook-cert\") pod \"metallb-operator-webhook-server-79448499bc-9ssng\" (UID: \"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a\") " pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.752632 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9gsj\" (UniqueName: \"kubernetes.io/projected/0fcd559c-f2fb-455a-9cc5-6be1cd6be98a-kube-api-access-m9gsj\") pod \"metallb-operator-webhook-server-79448499bc-9ssng\" (UID: \"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a\") " pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.752815 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/0fcd559c-f2fb-455a-9cc5-6be1cd6be98a-apiservice-cert\") pod \"metallb-operator-webhook-server-79448499bc-9ssng\" (UID: \"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a\") " pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.816188 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.860458 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fcd559c-f2fb-455a-9cc5-6be1cd6be98a-apiservice-cert\") pod \"metallb-operator-webhook-server-79448499bc-9ssng\" (UID: \"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a\") " pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.860690 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fcd559c-f2fb-455a-9cc5-6be1cd6be98a-webhook-cert\") pod \"metallb-operator-webhook-server-79448499bc-9ssng\" (UID: \"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a\") " pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.860796 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9gsj\" (UniqueName: \"kubernetes.io/projected/0fcd559c-f2fb-455a-9cc5-6be1cd6be98a-kube-api-access-m9gsj\") pod \"metallb-operator-webhook-server-79448499bc-9ssng\" (UID: \"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a\") " pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.865939 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/0fcd559c-f2fb-455a-9cc5-6be1cd6be98a-webhook-cert\") pod \"metallb-operator-webhook-server-79448499bc-9ssng\" (UID: \"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a\") " pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.866660 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fcd559c-f2fb-455a-9cc5-6be1cd6be98a-apiservice-cert\") pod \"metallb-operator-webhook-server-79448499bc-9ssng\" (UID: \"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a\") " pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:18 crc kubenswrapper[4923]: I0224 03:08:18.889067 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9gsj\" (UniqueName: \"kubernetes.io/projected/0fcd559c-f2fb-455a-9cc5-6be1cd6be98a-kube-api-access-m9gsj\") pod \"metallb-operator-webhook-server-79448499bc-9ssng\" (UID: \"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a\") " pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:19 crc kubenswrapper[4923]: I0224 03:08:19.048749 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:19 crc kubenswrapper[4923]: I0224 03:08:19.267491 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq"] Feb 24 03:08:19 crc kubenswrapper[4923]: W0224 03:08:19.270440 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod475ff3bc_195d_4768_892d_1c0274b3a25c.slice/crio-e8589ca371be2105c917706f02e8fc4438c4851facd127fc7e52969d1ae6c45a WatchSource:0}: Error finding container e8589ca371be2105c917706f02e8fc4438c4851facd127fc7e52969d1ae6c45a: Status 404 returned error can't find the container with id e8589ca371be2105c917706f02e8fc4438c4851facd127fc7e52969d1ae6c45a Feb 24 03:08:19 crc kubenswrapper[4923]: I0224 03:08:19.305455 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79448499bc-9ssng"] Feb 24 03:08:19 crc kubenswrapper[4923]: W0224 03:08:19.312706 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fcd559c_f2fb_455a_9cc5_6be1cd6be98a.slice/crio-68a35eef2bf6c3021a4820c479c6844b90974f129d6a32cff11a2c7e8483be6d WatchSource:0}: Error finding container 68a35eef2bf6c3021a4820c479c6844b90974f129d6a32cff11a2c7e8483be6d: Status 404 returned error can't find the container with id 68a35eef2bf6c3021a4820c479c6844b90974f129d6a32cff11a2c7e8483be6d Feb 24 03:08:19 crc kubenswrapper[4923]: I0224 03:08:19.890181 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" event={"ID":"475ff3bc-195d-4768-892d-1c0274b3a25c","Type":"ContainerStarted","Data":"e8589ca371be2105c917706f02e8fc4438c4851facd127fc7e52969d1ae6c45a"} Feb 24 03:08:19 crc kubenswrapper[4923]: I0224 03:08:19.891621 4923 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" event={"ID":"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a","Type":"ContainerStarted","Data":"68a35eef2bf6c3021a4820c479c6844b90974f129d6a32cff11a2c7e8483be6d"} Feb 24 03:08:23 crc kubenswrapper[4923]: I0224 03:08:23.930569 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" event={"ID":"475ff3bc-195d-4768-892d-1c0274b3a25c","Type":"ContainerStarted","Data":"c8d8bf2726b1cf2d8ba46ef30984679707f197520f13479a3da838254e1c1dfb"} Feb 24 03:08:23 crc kubenswrapper[4923]: I0224 03:08:23.930916 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:23 crc kubenswrapper[4923]: I0224 03:08:23.934391 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" event={"ID":"0fcd559c-f2fb-455a-9cc5-6be1cd6be98a","Type":"ContainerStarted","Data":"450d7cec57ce5227ba94d06697d200934c50e43fedcbe4ba29e8af48a4c0b996"} Feb 24 03:08:23 crc kubenswrapper[4923]: I0224 03:08:23.934537 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:23 crc kubenswrapper[4923]: I0224 03:08:23.976224 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" podStartSLOduration=1.616873022 podStartE2EDuration="5.97620904s" podCreationTimestamp="2026-02-24 03:08:18 +0000 UTC" firstStartedPulling="2026-02-24 03:08:19.273433283 +0000 UTC m=+823.290504096" lastFinishedPulling="2026-02-24 03:08:23.632769301 +0000 UTC m=+827.649840114" observedRunningTime="2026-02-24 03:08:23.964372574 +0000 UTC m=+827.981443397" watchObservedRunningTime="2026-02-24 03:08:23.97620904 +0000 UTC m=+827.993279853" Feb 24 
03:08:24 crc kubenswrapper[4923]: I0224 03:08:24.004122 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" podStartSLOduration=1.6581454949999999 podStartE2EDuration="6.004108798s" podCreationTimestamp="2026-02-24 03:08:18 +0000 UTC" firstStartedPulling="2026-02-24 03:08:19.317788793 +0000 UTC m=+823.334859596" lastFinishedPulling="2026-02-24 03:08:23.663752086 +0000 UTC m=+827.680822899" observedRunningTime="2026-02-24 03:08:24.001329378 +0000 UTC m=+828.018400181" watchObservedRunningTime="2026-02-24 03:08:24.004108798 +0000 UTC m=+828.021179611" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.768657 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wnpc4"] Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.770099 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.783905 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wnpc4"] Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.805928 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-utilities\") pod \"certified-operators-wnpc4\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.806034 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhgf6\" (UniqueName: \"kubernetes.io/projected/13a8b158-0179-4f72-b2f0-04ed304812c2-kube-api-access-bhgf6\") pod \"certified-operators-wnpc4\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " 
pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.806140 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-catalog-content\") pod \"certified-operators-wnpc4\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.907100 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-utilities\") pod \"certified-operators-wnpc4\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.907182 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhgf6\" (UniqueName: \"kubernetes.io/projected/13a8b158-0179-4f72-b2f0-04ed304812c2-kube-api-access-bhgf6\") pod \"certified-operators-wnpc4\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.907222 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-catalog-content\") pod \"certified-operators-wnpc4\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.907622 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-utilities\") pod \"certified-operators-wnpc4\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " 
pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.907755 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-catalog-content\") pod \"certified-operators-wnpc4\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:36 crc kubenswrapper[4923]: I0224 03:08:36.930224 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhgf6\" (UniqueName: \"kubernetes.io/projected/13a8b158-0179-4f72-b2f0-04ed304812c2-kube-api-access-bhgf6\") pod \"certified-operators-wnpc4\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:37 crc kubenswrapper[4923]: I0224 03:08:37.087620 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:37 crc kubenswrapper[4923]: I0224 03:08:37.344106 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wnpc4"] Feb 24 03:08:38 crc kubenswrapper[4923]: I0224 03:08:38.051179 4923 generic.go:334] "Generic (PLEG): container finished" podID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerID="cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768" exitCode=0 Feb 24 03:08:38 crc kubenswrapper[4923]: I0224 03:08:38.051252 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wnpc4" event={"ID":"13a8b158-0179-4f72-b2f0-04ed304812c2","Type":"ContainerDied","Data":"cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768"} Feb 24 03:08:38 crc kubenswrapper[4923]: I0224 03:08:38.051609 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wnpc4" 
event={"ID":"13a8b158-0179-4f72-b2f0-04ed304812c2","Type":"ContainerStarted","Data":"092b95544f944cb8eefe2b2bfce01f62934a8eb94a146319dc1e07fe20a93758"} Feb 24 03:08:39 crc kubenswrapper[4923]: I0224 03:08:39.057447 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79448499bc-9ssng" Feb 24 03:08:41 crc kubenswrapper[4923]: I0224 03:08:41.071093 4923 generic.go:334] "Generic (PLEG): container finished" podID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerID="e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68" exitCode=0 Feb 24 03:08:41 crc kubenswrapper[4923]: I0224 03:08:41.071162 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wnpc4" event={"ID":"13a8b158-0179-4f72-b2f0-04ed304812c2","Type":"ContainerDied","Data":"e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68"} Feb 24 03:08:42 crc kubenswrapper[4923]: I0224 03:08:42.080367 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wnpc4" event={"ID":"13a8b158-0179-4f72-b2f0-04ed304812c2","Type":"ContainerStarted","Data":"8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014"} Feb 24 03:08:47 crc kubenswrapper[4923]: I0224 03:08:47.088005 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:47 crc kubenswrapper[4923]: I0224 03:08:47.088055 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:47 crc kubenswrapper[4923]: I0224 03:08:47.128185 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:47 crc kubenswrapper[4923]: I0224 03:08:47.146548 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-wnpc4" podStartSLOduration=7.747526941 podStartE2EDuration="11.146529891s" podCreationTimestamp="2026-02-24 03:08:36 +0000 UTC" firstStartedPulling="2026-02-24 03:08:38.053992021 +0000 UTC m=+842.071062834" lastFinishedPulling="2026-02-24 03:08:41.452994971 +0000 UTC m=+845.470065784" observedRunningTime="2026-02-24 03:08:42.136829832 +0000 UTC m=+846.153900645" watchObservedRunningTime="2026-02-24 03:08:47.146529891 +0000 UTC m=+851.163600714" Feb 24 03:08:47 crc kubenswrapper[4923]: I0224 03:08:47.176902 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.362212 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wnpc4"] Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.362752 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wnpc4" podUID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerName="registry-server" containerID="cri-o://8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014" gracePeriod=2 Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.712904 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.774965 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-utilities\") pod \"13a8b158-0179-4f72-b2f0-04ed304812c2\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.775064 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-catalog-content\") pod \"13a8b158-0179-4f72-b2f0-04ed304812c2\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.775106 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhgf6\" (UniqueName: \"kubernetes.io/projected/13a8b158-0179-4f72-b2f0-04ed304812c2-kube-api-access-bhgf6\") pod \"13a8b158-0179-4f72-b2f0-04ed304812c2\" (UID: \"13a8b158-0179-4f72-b2f0-04ed304812c2\") " Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.776894 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-utilities" (OuterVolumeSpecName: "utilities") pod "13a8b158-0179-4f72-b2f0-04ed304812c2" (UID: "13a8b158-0179-4f72-b2f0-04ed304812c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.784580 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a8b158-0179-4f72-b2f0-04ed304812c2-kube-api-access-bhgf6" (OuterVolumeSpecName: "kube-api-access-bhgf6") pod "13a8b158-0179-4f72-b2f0-04ed304812c2" (UID: "13a8b158-0179-4f72-b2f0-04ed304812c2"). InnerVolumeSpecName "kube-api-access-bhgf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.846513 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13a8b158-0179-4f72-b2f0-04ed304812c2" (UID: "13a8b158-0179-4f72-b2f0-04ed304812c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.876530 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.876561 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13a8b158-0179-4f72-b2f0-04ed304812c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:49 crc kubenswrapper[4923]: I0224 03:08:49.876571 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhgf6\" (UniqueName: \"kubernetes.io/projected/13a8b158-0179-4f72-b2f0-04ed304812c2-kube-api-access-bhgf6\") on node \"crc\" DevicePath \"\"" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.132397 4923 generic.go:334] "Generic (PLEG): container finished" podID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerID="8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014" exitCode=0 Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.132454 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wnpc4" event={"ID":"13a8b158-0179-4f72-b2f0-04ed304812c2","Type":"ContainerDied","Data":"8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014"} Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.132471 4923 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wnpc4" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.132499 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wnpc4" event={"ID":"13a8b158-0179-4f72-b2f0-04ed304812c2","Type":"ContainerDied","Data":"092b95544f944cb8eefe2b2bfce01f62934a8eb94a146319dc1e07fe20a93758"} Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.132525 4923 scope.go:117] "RemoveContainer" containerID="8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.160352 4923 scope.go:117] "RemoveContainer" containerID="e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.178680 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wnpc4"] Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.181790 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wnpc4"] Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.185322 4923 scope.go:117] "RemoveContainer" containerID="cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.227447 4923 scope.go:117] "RemoveContainer" containerID="8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014" Feb 24 03:08:50 crc kubenswrapper[4923]: E0224 03:08:50.228817 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014\": container with ID starting with 8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014 not found: ID does not exist" containerID="8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.228856 
4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014"} err="failed to get container status \"8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014\": rpc error: code = NotFound desc = could not find container \"8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014\": container with ID starting with 8adcc0c492d7b611d1104ba2a6135ee9d32bb8cbcb4f34738d616fafd5ffd014 not found: ID does not exist" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.228881 4923 scope.go:117] "RemoveContainer" containerID="e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68" Feb 24 03:08:50 crc kubenswrapper[4923]: E0224 03:08:50.229734 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68\": container with ID starting with e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68 not found: ID does not exist" containerID="e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.229821 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68"} err="failed to get container status \"e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68\": rpc error: code = NotFound desc = could not find container \"e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68\": container with ID starting with e8e87801c4691df0613d0092f0bb94f0bea0034e08c1a9078d2b59bcb3eb7b68 not found: ID does not exist" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.229870 4923 scope.go:117] "RemoveContainer" containerID="cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768" Feb 24 03:08:50 crc kubenswrapper[4923]: E0224 
03:08:50.230351 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768\": container with ID starting with cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768 not found: ID does not exist" containerID="cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768" Feb 24 03:08:50 crc kubenswrapper[4923]: I0224 03:08:50.230412 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768"} err="failed to get container status \"cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768\": rpc error: code = NotFound desc = could not find container \"cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768\": container with ID starting with cd399c2ae29203e565db7d2aae242019ec95328e3084222da81ebc3751dbe768 not found: ID does not exist" Feb 24 03:08:51 crc kubenswrapper[4923]: I0224 03:08:51.724332 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a8b158-0179-4f72-b2f0-04ed304812c2" path="/var/lib/kubelet/pods/13a8b158-0179-4f72-b2f0-04ed304812c2/volumes" Feb 24 03:08:58 crc kubenswrapper[4923]: I0224 03:08:58.820366 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84dbcb4757-pvzfq" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.647462 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zj8qg"] Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.648203 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerName="extract-content" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.648228 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a8b158-0179-4f72-b2f0-04ed304812c2" 
containerName="extract-content" Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.648249 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerName="registry-server" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.648260 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerName="registry-server" Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.648315 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerName="extract-utilities" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.648328 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerName="extract-utilities" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.648493 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a8b158-0179-4f72-b2f0-04ed304812c2" containerName="registry-server" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.651815 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.654051 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.655503 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx"] Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.656704 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.658510 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.658510 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kp6w2" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.661481 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.667769 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx"] Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.715380 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm6j7\" (UniqueName: \"kubernetes.io/projected/20a6c50f-c649-420b-b092-7b2015b8436e-kube-api-access-bm6j7\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.715423 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-frr-sockets\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.715444 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a6c50f-c649-420b-b092-7b2015b8436e-metrics-certs\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 
03:08:59.715470 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbk7\" (UniqueName: \"kubernetes.io/projected/b9c601e3-2c93-4989-a6e9-20542436ace6-kube-api-access-5qbk7\") pod \"frr-k8s-webhook-server-78b44bf5bb-8hjgx\" (UID: \"b9c601e3-2c93-4989-a6e9-20542436ace6\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.715487 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-metrics\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.715503 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/20a6c50f-c649-420b-b092-7b2015b8436e-frr-startup\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.715520 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-reloader\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.715547 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9c601e3-2c93-4989-a6e9-20542436ace6-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8hjgx\" (UID: \"b9c601e3-2c93-4989-a6e9-20542436ace6\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.715562 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-frr-conf\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.735760 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-k5l69"] Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.737003 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.738933 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.739152 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-n22wc" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.739351 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.739585 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.760408 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-7bl75"] Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.761221 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.765690 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.770092 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-7bl75"] Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816379 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-metrics-certs\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816416 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf12b42-c896-4b13-954c-1ef5753c3fc0-metrics-certs\") pod \"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816436 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbk7\" (UniqueName: \"kubernetes.io/projected/b9c601e3-2c93-4989-a6e9-20542436ace6-kube-api-access-5qbk7\") pod \"frr-k8s-webhook-server-78b44bf5bb-8hjgx\" (UID: \"b9c601e3-2c93-4989-a6e9-20542436ace6\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816459 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-metrics\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc 
kubenswrapper[4923]: I0224 03:08:59.816513 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/20a6c50f-c649-420b-b092-7b2015b8436e-frr-startup\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816540 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-reloader\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816575 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebf12b42-c896-4b13-954c-1ef5753c3fc0-cert\") pod \"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816612 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9c601e3-2c93-4989-a6e9-20542436ace6-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8hjgx\" (UID: \"b9c601e3-2c93-4989-a6e9-20542436ace6\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816640 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-frr-conf\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816686 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dp8dw\" (UniqueName: \"kubernetes.io/projected/ebf12b42-c896-4b13-954c-1ef5753c3fc0-kube-api-access-dp8dw\") pod \"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816706 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrrz\" (UniqueName: \"kubernetes.io/projected/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-kube-api-access-7qrrz\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816775 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-metrics\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816791 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm6j7\" (UniqueName: \"kubernetes.io/projected/20a6c50f-c649-420b-b092-7b2015b8436e-kube-api-access-bm6j7\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816817 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-metallb-excludel2\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816875 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-frr-sockets\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816900 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-memberlist\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.816948 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a6c50f-c649-420b-b092-7b2015b8436e-metrics-certs\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.817522 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-frr-conf\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.817780 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/20a6c50f-c649-420b-b092-7b2015b8436e-frr-startup\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.818024 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-reloader\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: 
I0224 03:08:59.818182 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/20a6c50f-c649-420b-b092-7b2015b8436e-frr-sockets\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.818247 4923 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.818290 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a6c50f-c649-420b-b092-7b2015b8436e-metrics-certs podName:20a6c50f-c649-420b-b092-7b2015b8436e nodeName:}" failed. No retries permitted until 2026-02-24 03:09:00.31827563 +0000 UTC m=+864.335346443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20a6c50f-c649-420b-b092-7b2015b8436e-metrics-certs") pod "frr-k8s-zj8qg" (UID: "20a6c50f-c649-420b-b092-7b2015b8436e") : secret "frr-k8s-certs-secret" not found Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.822217 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9c601e3-2c93-4989-a6e9-20542436ace6-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8hjgx\" (UID: \"b9c601e3-2c93-4989-a6e9-20542436ace6\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.832725 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbk7\" (UniqueName: \"kubernetes.io/projected/b9c601e3-2c93-4989-a6e9-20542436ace6-kube-api-access-5qbk7\") pod \"frr-k8s-webhook-server-78b44bf5bb-8hjgx\" (UID: \"b9c601e3-2c93-4989-a6e9-20542436ace6\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.839937 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm6j7\" (UniqueName: \"kubernetes.io/projected/20a6c50f-c649-420b-b092-7b2015b8436e-kube-api-access-bm6j7\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.917776 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebf12b42-c896-4b13-954c-1ef5753c3fc0-cert\") pod \"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.917850 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp8dw\" (UniqueName: \"kubernetes.io/projected/ebf12b42-c896-4b13-954c-1ef5753c3fc0-kube-api-access-dp8dw\") pod \"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.917898 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrrz\" (UniqueName: \"kubernetes.io/projected/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-kube-api-access-7qrrz\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.917934 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-metallb-excludel2\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.918239 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-memberlist\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.918278 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-metrics-certs\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.918308 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf12b42-c896-4b13-954c-1ef5753c3fc0-metrics-certs\") pod \"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.918392 4923 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.918435 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf12b42-c896-4b13-954c-1ef5753c3fc0-metrics-certs podName:ebf12b42-c896-4b13-954c-1ef5753c3fc0 nodeName:}" failed. No retries permitted until 2026-02-24 03:09:00.418421965 +0000 UTC m=+864.435492778 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ebf12b42-c896-4b13-954c-1ef5753c3fc0-metrics-certs") pod "controller-69bbfbf88f-7bl75" (UID: "ebf12b42-c896-4b13-954c-1ef5753c3fc0") : secret "controller-certs-secret" not found Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.918633 4923 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.918674 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-memberlist podName:94515a6b-ba32-4b66-9cbf-42f9e0e38d14 nodeName:}" failed. No retries permitted until 2026-02-24 03:09:00.418662722 +0000 UTC m=+864.435733535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-memberlist") pod "speaker-k5l69" (UID: "94515a6b-ba32-4b66-9cbf-42f9e0e38d14") : secret "metallb-memberlist" not found Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.918719 4923 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 24 03:08:59 crc kubenswrapper[4923]: E0224 03:08:59.918746 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-metrics-certs podName:94515a6b-ba32-4b66-9cbf-42f9e0e38d14 nodeName:}" failed. No retries permitted until 2026-02-24 03:09:00.418738884 +0000 UTC m=+864.435809697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-metrics-certs") pod "speaker-k5l69" (UID: "94515a6b-ba32-4b66-9cbf-42f9e0e38d14") : secret "speaker-certs-secret" not found Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.918750 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-metallb-excludel2\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.920419 4923 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.930726 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebf12b42-c896-4b13-954c-1ef5753c3fc0-cert\") pod \"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.935506 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp8dw\" (UniqueName: \"kubernetes.io/projected/ebf12b42-c896-4b13-954c-1ef5753c3fc0-kube-api-access-dp8dw\") pod \"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.937218 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrrz\" (UniqueName: \"kubernetes.io/projected/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-kube-api-access-7qrrz\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:08:59 crc kubenswrapper[4923]: I0224 03:08:59.981224 4923 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.325321 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a6c50f-c649-420b-b092-7b2015b8436e-metrics-certs\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.328950 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a6c50f-c649-420b-b092-7b2015b8436e-metrics-certs\") pod \"frr-k8s-zj8qg\" (UID: \"20a6c50f-c649-420b-b092-7b2015b8436e\") " pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.364449 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx"] Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.426466 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-memberlist\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.426528 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-metrics-certs\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.426551 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf12b42-c896-4b13-954c-1ef5753c3fc0-metrics-certs\") pod 
\"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:09:00 crc kubenswrapper[4923]: E0224 03:09:00.426772 4923 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 24 03:09:00 crc kubenswrapper[4923]: E0224 03:09:00.427515 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-memberlist podName:94515a6b-ba32-4b66-9cbf-42f9e0e38d14 nodeName:}" failed. No retries permitted until 2026-02-24 03:09:01.426889354 +0000 UTC m=+865.443960207 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-memberlist") pod "speaker-k5l69" (UID: "94515a6b-ba32-4b66-9cbf-42f9e0e38d14") : secret "metallb-memberlist" not found Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.430628 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-metrics-certs\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.433499 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ebf12b42-c896-4b13-954c-1ef5753c3fc0-metrics-certs\") pod \"controller-69bbfbf88f-7bl75\" (UID: \"ebf12b42-c896-4b13-954c-1ef5753c3fc0\") " pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.574607 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.676928 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:09:00 crc kubenswrapper[4923]: I0224 03:09:00.880733 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-7bl75"] Feb 24 03:09:00 crc kubenswrapper[4923]: W0224 03:09:00.892841 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf12b42_c896_4b13_954c_1ef5753c3fc0.slice/crio-94fd8a31e44f20cef82148e22e947d31ac17fdde9b74c66d7cad16335832e946 WatchSource:0}: Error finding container 94fd8a31e44f20cef82148e22e947d31ac17fdde9b74c66d7cad16335832e946: Status 404 returned error can't find the container with id 94fd8a31e44f20cef82148e22e947d31ac17fdde9b74c66d7cad16335832e946 Feb 24 03:09:01 crc kubenswrapper[4923]: I0224 03:09:01.224903 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" event={"ID":"b9c601e3-2c93-4989-a6e9-20542436ace6","Type":"ContainerStarted","Data":"f2ab9b12ab8d47725a917b788f820febb7c6b7d93f708b9b5ded5b16e31ae3da"} Feb 24 03:09:01 crc kubenswrapper[4923]: I0224 03:09:01.226691 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7bl75" event={"ID":"ebf12b42-c896-4b13-954c-1ef5753c3fc0","Type":"ContainerStarted","Data":"44cfabf01a9119dc338cdf426932d6ae1221e28885c700476391eaa46071f14d"} Feb 24 03:09:01 crc kubenswrapper[4923]: I0224 03:09:01.226797 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7bl75" event={"ID":"ebf12b42-c896-4b13-954c-1ef5753c3fc0","Type":"ContainerStarted","Data":"94fd8a31e44f20cef82148e22e947d31ac17fdde9b74c66d7cad16335832e946"} Feb 24 03:09:01 crc kubenswrapper[4923]: I0224 03:09:01.227744 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" 
event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerStarted","Data":"da3376d6eda881b19534b0c80dea3fa4741366b4f7d7d63b274aa9144cd0e3c4"} Feb 24 03:09:01 crc kubenswrapper[4923]: I0224 03:09:01.446688 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-memberlist\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:09:01 crc kubenswrapper[4923]: I0224 03:09:01.459639 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/94515a6b-ba32-4b66-9cbf-42f9e0e38d14-memberlist\") pod \"speaker-k5l69\" (UID: \"94515a6b-ba32-4b66-9cbf-42f9e0e38d14\") " pod="metallb-system/speaker-k5l69" Feb 24 03:09:01 crc kubenswrapper[4923]: I0224 03:09:01.548550 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-k5l69" Feb 24 03:09:01 crc kubenswrapper[4923]: W0224 03:09:01.567879 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94515a6b_ba32_4b66_9cbf_42f9e0e38d14.slice/crio-8eba42d007197f3119453dea219578c8fd20596dda755ebd9da7d793d6e2dca2 WatchSource:0}: Error finding container 8eba42d007197f3119453dea219578c8fd20596dda755ebd9da7d793d6e2dca2: Status 404 returned error can't find the container with id 8eba42d007197f3119453dea219578c8fd20596dda755ebd9da7d793d6e2dca2 Feb 24 03:09:02 crc kubenswrapper[4923]: I0224 03:09:02.242209 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7bl75" event={"ID":"ebf12b42-c896-4b13-954c-1ef5753c3fc0","Type":"ContainerStarted","Data":"dd1763ec06f921bc7c2e2489a94f745895c59f42913379ea9c5bf6e5672bb6e7"} Feb 24 03:09:02 crc kubenswrapper[4923]: I0224 03:09:02.242735 4923 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-7bl75" Feb 24 03:09:02 crc kubenswrapper[4923]: I0224 03:09:02.251854 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-k5l69" event={"ID":"94515a6b-ba32-4b66-9cbf-42f9e0e38d14","Type":"ContainerStarted","Data":"e6a7a2831fed98daea6441c6875fa5ee150f47d3f1bf7d501fcbcdcc40cfd9d7"} Feb 24 03:09:02 crc kubenswrapper[4923]: I0224 03:09:02.251903 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-k5l69" event={"ID":"94515a6b-ba32-4b66-9cbf-42f9e0e38d14","Type":"ContainerStarted","Data":"95ffa6d19a64df2110489026ef41584c82313841f124223b1d21bac52700b8b9"} Feb 24 03:09:02 crc kubenswrapper[4923]: I0224 03:09:02.251915 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-k5l69" event={"ID":"94515a6b-ba32-4b66-9cbf-42f9e0e38d14","Type":"ContainerStarted","Data":"8eba42d007197f3119453dea219578c8fd20596dda755ebd9da7d793d6e2dca2"} Feb 24 03:09:02 crc kubenswrapper[4923]: I0224 03:09:02.252139 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-k5l69" Feb 24 03:09:02 crc kubenswrapper[4923]: I0224 03:09:02.271483 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-7bl75" podStartSLOduration=3.271458911 podStartE2EDuration="3.271458911s" podCreationTimestamp="2026-02-24 03:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:09:02.263999554 +0000 UTC m=+866.281070367" watchObservedRunningTime="2026-02-24 03:09:02.271458911 +0000 UTC m=+866.288529774" Feb 24 03:09:02 crc kubenswrapper[4923]: I0224 03:09:02.290764 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-k5l69" podStartSLOduration=3.290744671 podStartE2EDuration="3.290744671s" 
podCreationTimestamp="2026-02-24 03:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:09:02.286632402 +0000 UTC m=+866.303703215" watchObservedRunningTime="2026-02-24 03:09:02.290744671 +0000 UTC m=+866.307815484" Feb 24 03:09:07 crc kubenswrapper[4923]: I0224 03:09:07.280055 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" event={"ID":"b9c601e3-2c93-4989-a6e9-20542436ace6","Type":"ContainerStarted","Data":"3613018b09e59e9c030113256eaa1ebd4a15d2109b440ebe6c7334102264cd68"} Feb 24 03:09:07 crc kubenswrapper[4923]: I0224 03:09:07.280419 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" Feb 24 03:09:07 crc kubenswrapper[4923]: I0224 03:09:07.281639 4923 generic.go:334] "Generic (PLEG): container finished" podID="20a6c50f-c649-420b-b092-7b2015b8436e" containerID="195037e9c714e487ed0cd39c616da99cc00dc08a861c34cfb7d0bc6f5b16ecc6" exitCode=0 Feb 24 03:09:07 crc kubenswrapper[4923]: I0224 03:09:07.281704 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerDied","Data":"195037e9c714e487ed0cd39c616da99cc00dc08a861c34cfb7d0bc6f5b16ecc6"} Feb 24 03:09:07 crc kubenswrapper[4923]: I0224 03:09:07.307194 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx" podStartSLOduration=1.706342026 podStartE2EDuration="8.307132797s" podCreationTimestamp="2026-02-24 03:08:59 +0000 UTC" firstStartedPulling="2026-02-24 03:09:00.375514848 +0000 UTC m=+864.392585671" lastFinishedPulling="2026-02-24 03:09:06.976305629 +0000 UTC m=+870.993376442" observedRunningTime="2026-02-24 03:09:07.29856052 +0000 UTC m=+871.315631353" watchObservedRunningTime="2026-02-24 
03:09:07.307132797 +0000 UTC m=+871.324203650" Feb 24 03:09:08 crc kubenswrapper[4923]: I0224 03:09:08.292809 4923 generic.go:334] "Generic (PLEG): container finished" podID="20a6c50f-c649-420b-b092-7b2015b8436e" containerID="ba1d352923e31495e34fdbe295c79fc036849718badcc8c218faaaed75344bda" exitCode=0 Feb 24 03:09:08 crc kubenswrapper[4923]: I0224 03:09:08.292901 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerDied","Data":"ba1d352923e31495e34fdbe295c79fc036849718badcc8c218faaaed75344bda"} Feb 24 03:09:09 crc kubenswrapper[4923]: I0224 03:09:09.302848 4923 generic.go:334] "Generic (PLEG): container finished" podID="20a6c50f-c649-420b-b092-7b2015b8436e" containerID="e3cbe3267ed2316786f347aee1405fd6c48797ffcf46860f06d3311b786acbe9" exitCode=0 Feb 24 03:09:09 crc kubenswrapper[4923]: I0224 03:09:09.302960 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerDied","Data":"e3cbe3267ed2316786f347aee1405fd6c48797ffcf46860f06d3311b786acbe9"} Feb 24 03:09:10 crc kubenswrapper[4923]: I0224 03:09:10.316025 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerStarted","Data":"310ce85d4d8772d65a271132e1d6f02d0989d946312ef68e4cd0631228eb5f42"} Feb 24 03:09:10 crc kubenswrapper[4923]: I0224 03:09:10.316374 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerStarted","Data":"36b728fa5d3e8fe192c00545c49c0e08155e87a8a72610bb9ee38aa33dced144"} Feb 24 03:09:10 crc kubenswrapper[4923]: I0224 03:09:10.316385 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" 
event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerStarted","Data":"a17aebf5f205e677f03e0892824c92a24760e9d54aaa494d0b4e04a8c75b77d6"} Feb 24 03:09:10 crc kubenswrapper[4923]: I0224 03:09:10.316395 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerStarted","Data":"a7e94575c16906360af998488a59181c27ee3310c7ed1fca391862633cf453f3"} Feb 24 03:09:10 crc kubenswrapper[4923]: I0224 03:09:10.316404 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerStarted","Data":"886c404e7ec2fa6cda1963aad324e21981dd3fbff69d55eb44057391baac48e4"} Feb 24 03:09:11 crc kubenswrapper[4923]: I0224 03:09:11.328804 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zj8qg" event={"ID":"20a6c50f-c649-420b-b092-7b2015b8436e","Type":"ContainerStarted","Data":"f7f67169962f01c1c735aa11c971e27c227f3c02b179045aa8efc573032e42ea"} Feb 24 03:09:11 crc kubenswrapper[4923]: I0224 03:09:11.329220 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zj8qg" Feb 24 03:09:11 crc kubenswrapper[4923]: I0224 03:09:11.362055 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zj8qg" podStartSLOduration=6.15850236 podStartE2EDuration="12.36203487s" podCreationTimestamp="2026-02-24 03:08:59 +0000 UTC" firstStartedPulling="2026-02-24 03:09:00.758818351 +0000 UTC m=+864.775889194" lastFinishedPulling="2026-02-24 03:09:06.962350891 +0000 UTC m=+870.979421704" observedRunningTime="2026-02-24 03:09:11.357839149 +0000 UTC m=+875.374909962" watchObservedRunningTime="2026-02-24 03:09:11.36203487 +0000 UTC m=+875.379105693" Feb 24 03:09:11 crc kubenswrapper[4923]: I0224 03:09:11.555378 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/speaker-k5l69" Feb 24 03:09:14 crc kubenswrapper[4923]: I0224 03:09:14.321969 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vfdpl"] Feb 24 03:09:14 crc kubenswrapper[4923]: I0224 03:09:14.323015 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vfdpl" Feb 24 03:09:14 crc kubenswrapper[4923]: I0224 03:09:14.325128 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 24 03:09:14 crc kubenswrapper[4923]: I0224 03:09:14.325127 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 24 03:09:14 crc kubenswrapper[4923]: I0224 03:09:14.325730 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-w9mdr" Feb 24 03:09:14 crc kubenswrapper[4923]: I0224 03:09:14.331221 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vfdpl"] Feb 24 03:09:14 crc kubenswrapper[4923]: I0224 03:09:14.417429 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjckd\" (UniqueName: \"kubernetes.io/projected/cef055b8-6493-4cbf-b158-fded7a4321ac-kube-api-access-zjckd\") pod \"openstack-operator-index-vfdpl\" (UID: \"cef055b8-6493-4cbf-b158-fded7a4321ac\") " pod="openstack-operators/openstack-operator-index-vfdpl" Feb 24 03:09:14 crc kubenswrapper[4923]: I0224 03:09:14.518546 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjckd\" (UniqueName: \"kubernetes.io/projected/cef055b8-6493-4cbf-b158-fded7a4321ac-kube-api-access-zjckd\") pod \"openstack-operator-index-vfdpl\" (UID: \"cef055b8-6493-4cbf-b158-fded7a4321ac\") " pod="openstack-operators/openstack-operator-index-vfdpl" Feb 24 03:09:14 
crc kubenswrapper[4923]: I0224 03:09:14.536167 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjckd\" (UniqueName: \"kubernetes.io/projected/cef055b8-6493-4cbf-b158-fded7a4321ac-kube-api-access-zjckd\") pod \"openstack-operator-index-vfdpl\" (UID: \"cef055b8-6493-4cbf-b158-fded7a4321ac\") " pod="openstack-operators/openstack-operator-index-vfdpl"
Feb 24 03:09:14 crc kubenswrapper[4923]: I0224 03:09:14.649423 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vfdpl"
Feb 24 03:09:15 crc kubenswrapper[4923]: I0224 03:09:15.045525 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vfdpl"]
Feb 24 03:09:15 crc kubenswrapper[4923]: I0224 03:09:15.356832 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vfdpl" event={"ID":"cef055b8-6493-4cbf-b158-fded7a4321ac","Type":"ContainerStarted","Data":"bf1b457e2f58736a4e5847a06c6a1f6fd3776adf3738dbb6f334074982a527dd"}
Feb 24 03:09:15 crc kubenswrapper[4923]: I0224 03:09:15.574955 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zj8qg"
Feb 24 03:09:15 crc kubenswrapper[4923]: I0224 03:09:15.638208 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zj8qg"
Feb 24 03:09:17 crc kubenswrapper[4923]: I0224 03:09:17.709440 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vfdpl"]
Feb 24 03:09:18 crc kubenswrapper[4923]: I0224 03:09:18.318702 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zscrj"]
Feb 24 03:09:18 crc kubenswrapper[4923]: I0224 03:09:18.319738 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zscrj"
Feb 24 03:09:18 crc kubenswrapper[4923]: I0224 03:09:18.327755 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zscrj"]
Feb 24 03:09:18 crc kubenswrapper[4923]: I0224 03:09:18.380841 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68t9k\" (UniqueName: \"kubernetes.io/projected/d988ae64-eb6b-4f86-a51d-5b61eb8b6d35-kube-api-access-68t9k\") pod \"openstack-operator-index-zscrj\" (UID: \"d988ae64-eb6b-4f86-a51d-5b61eb8b6d35\") " pod="openstack-operators/openstack-operator-index-zscrj"
Feb 24 03:09:18 crc kubenswrapper[4923]: I0224 03:09:18.380830 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vfdpl" event={"ID":"cef055b8-6493-4cbf-b158-fded7a4321ac","Type":"ContainerStarted","Data":"8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6"}
Feb 24 03:09:18 crc kubenswrapper[4923]: I0224 03:09:18.403762 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vfdpl" podStartSLOduration=2.027047255 podStartE2EDuration="4.403747636s" podCreationTimestamp="2026-02-24 03:09:14 +0000 UTC" firstStartedPulling="2026-02-24 03:09:15.057940131 +0000 UTC m=+879.075010944" lastFinishedPulling="2026-02-24 03:09:17.434640492 +0000 UTC m=+881.451711325" observedRunningTime="2026-02-24 03:09:18.402032491 +0000 UTC m=+882.419103304" watchObservedRunningTime="2026-02-24 03:09:18.403747636 +0000 UTC m=+882.420818449"
Feb 24 03:09:18 crc kubenswrapper[4923]: I0224 03:09:18.482981 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68t9k\" (UniqueName: \"kubernetes.io/projected/d988ae64-eb6b-4f86-a51d-5b61eb8b6d35-kube-api-access-68t9k\") pod \"openstack-operator-index-zscrj\" (UID: \"d988ae64-eb6b-4f86-a51d-5b61eb8b6d35\") " pod="openstack-operators/openstack-operator-index-zscrj"
Feb 24 03:09:18 crc kubenswrapper[4923]: I0224 03:09:18.502853 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68t9k\" (UniqueName: \"kubernetes.io/projected/d988ae64-eb6b-4f86-a51d-5b61eb8b6d35-kube-api-access-68t9k\") pod \"openstack-operator-index-zscrj\" (UID: \"d988ae64-eb6b-4f86-a51d-5b61eb8b6d35\") " pod="openstack-operators/openstack-operator-index-zscrj"
Feb 24 03:09:18 crc kubenswrapper[4923]: I0224 03:09:18.639002 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zscrj"
Feb 24 03:09:19 crc kubenswrapper[4923]: I0224 03:09:19.160335 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zscrj"]
Feb 24 03:09:19 crc kubenswrapper[4923]: W0224 03:09:19.172878 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd988ae64_eb6b_4f86_a51d_5b61eb8b6d35.slice/crio-cd3cbf8393edbb855b7bda31014436a0faabc64f70710d27d1a425b16bde4303 WatchSource:0}: Error finding container cd3cbf8393edbb855b7bda31014436a0faabc64f70710d27d1a425b16bde4303: Status 404 returned error can't find the container with id cd3cbf8393edbb855b7bda31014436a0faabc64f70710d27d1a425b16bde4303
Feb 24 03:09:19 crc kubenswrapper[4923]: I0224 03:09:19.389769 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-vfdpl" podUID="cef055b8-6493-4cbf-b158-fded7a4321ac" containerName="registry-server" containerID="cri-o://8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6" gracePeriod=2
Feb 24 03:09:19 crc kubenswrapper[4923]: I0224 03:09:19.389910 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zscrj" event={"ID":"d988ae64-eb6b-4f86-a51d-5b61eb8b6d35","Type":"ContainerStarted","Data":"cd3cbf8393edbb855b7bda31014436a0faabc64f70710d27d1a425b16bde4303"}
Feb 24 03:09:19 crc kubenswrapper[4923]: I0224 03:09:19.829543 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vfdpl"
Feb 24 03:09:19 crc kubenswrapper[4923]: I0224 03:09:19.903423 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjckd\" (UniqueName: \"kubernetes.io/projected/cef055b8-6493-4cbf-b158-fded7a4321ac-kube-api-access-zjckd\") pod \"cef055b8-6493-4cbf-b158-fded7a4321ac\" (UID: \"cef055b8-6493-4cbf-b158-fded7a4321ac\") "
Feb 24 03:09:19 crc kubenswrapper[4923]: I0224 03:09:19.908980 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef055b8-6493-4cbf-b158-fded7a4321ac-kube-api-access-zjckd" (OuterVolumeSpecName: "kube-api-access-zjckd") pod "cef055b8-6493-4cbf-b158-fded7a4321ac" (UID: "cef055b8-6493-4cbf-b158-fded7a4321ac"). InnerVolumeSpecName "kube-api-access-zjckd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:09:19 crc kubenswrapper[4923]: I0224 03:09:19.916319 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 03:09:19 crc kubenswrapper[4923]: I0224 03:09:19.916380 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 03:09:19 crc kubenswrapper[4923]: I0224 03:09:19.990548 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8hjgx"
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.004953 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjckd\" (UniqueName: \"kubernetes.io/projected/cef055b8-6493-4cbf-b158-fded7a4321ac-kube-api-access-zjckd\") on node \"crc\" DevicePath \"\""
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.404034 4923 generic.go:334] "Generic (PLEG): container finished" podID="cef055b8-6493-4cbf-b158-fded7a4321ac" containerID="8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6" exitCode=0
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.404080 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vfdpl"
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.404102 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vfdpl" event={"ID":"cef055b8-6493-4cbf-b158-fded7a4321ac","Type":"ContainerDied","Data":"8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6"}
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.404175 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vfdpl" event={"ID":"cef055b8-6493-4cbf-b158-fded7a4321ac","Type":"ContainerDied","Data":"bf1b457e2f58736a4e5847a06c6a1f6fd3776adf3738dbb6f334074982a527dd"}
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.404208 4923 scope.go:117] "RemoveContainer" containerID="8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6"
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.410482 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zscrj" event={"ID":"d988ae64-eb6b-4f86-a51d-5b61eb8b6d35","Type":"ContainerStarted","Data":"9dfa1e24d96a7e28d98cb8bb000f8371e19f368fcf45ed4de064799f2fe2e971"}
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.430255 4923 scope.go:117] "RemoveContainer" containerID="8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6"
Feb 24 03:09:20 crc kubenswrapper[4923]: E0224 03:09:20.430723 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6\": container with ID starting with 8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6 not found: ID does not exist" containerID="8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6"
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.430766 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6"} err="failed to get container status \"8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6\": rpc error: code = NotFound desc = could not find container \"8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6\": container with ID starting with 8d24833848f9328828851b76d50fa7e66f2b5dc3b49a3c7cd24b482bc551b0e6 not found: ID does not exist"
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.439898 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zscrj" podStartSLOduration=2.373027728 podStartE2EDuration="2.439871373s" podCreationTimestamp="2026-02-24 03:09:18 +0000 UTC" firstStartedPulling="2026-02-24 03:09:19.179709371 +0000 UTC m=+883.196780224" lastFinishedPulling="2026-02-24 03:09:19.246553056 +0000 UTC m=+883.263623869" observedRunningTime="2026-02-24 03:09:20.435910588 +0000 UTC m=+884.452981441" watchObservedRunningTime="2026-02-24 03:09:20.439871373 +0000 UTC m=+884.456942226"
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.460532 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vfdpl"]
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.463623 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-vfdpl"]
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.578930 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zj8qg"
Feb 24 03:09:20 crc kubenswrapper[4923]: I0224 03:09:20.682880 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-7bl75"
Feb 24 03:09:21 crc kubenswrapper[4923]: I0224 03:09:21.726048 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef055b8-6493-4cbf-b158-fded7a4321ac" path="/var/lib/kubelet/pods/cef055b8-6493-4cbf-b158-fded7a4321ac/volumes"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.715886 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-98cjm"]
Feb 24 03:09:22 crc kubenswrapper[4923]: E0224 03:09:22.716138 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef055b8-6493-4cbf-b158-fded7a4321ac" containerName="registry-server"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.716151 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef055b8-6493-4cbf-b158-fded7a4321ac" containerName="registry-server"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.716314 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef055b8-6493-4cbf-b158-fded7a4321ac" containerName="registry-server"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.717704 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.732059 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98cjm"]
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.879088 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-catalog-content\") pod \"redhat-marketplace-98cjm\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") " pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.879433 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-utilities\") pod \"redhat-marketplace-98cjm\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") " pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.879587 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj45g\" (UniqueName: \"kubernetes.io/projected/a60fe915-562a-4824-99f5-5875fceebcf9-kube-api-access-qj45g\") pod \"redhat-marketplace-98cjm\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") " pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.981285 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-utilities\") pod \"redhat-marketplace-98cjm\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") " pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.981364 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj45g\" (UniqueName: \"kubernetes.io/projected/a60fe915-562a-4824-99f5-5875fceebcf9-kube-api-access-qj45g\") pod \"redhat-marketplace-98cjm\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") " pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.981392 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-catalog-content\") pod \"redhat-marketplace-98cjm\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") " pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.981821 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-catalog-content\") pod \"redhat-marketplace-98cjm\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") " pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:22 crc kubenswrapper[4923]: I0224 03:09:22.982128 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-utilities\") pod \"redhat-marketplace-98cjm\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") " pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:23 crc kubenswrapper[4923]: I0224 03:09:23.001249 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj45g\" (UniqueName: \"kubernetes.io/projected/a60fe915-562a-4824-99f5-5875fceebcf9-kube-api-access-qj45g\") pod \"redhat-marketplace-98cjm\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") " pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:23 crc kubenswrapper[4923]: I0224 03:09:23.080595 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:23 crc kubenswrapper[4923]: I0224 03:09:23.509485 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98cjm"]
Feb 24 03:09:25 crc kubenswrapper[4923]: I0224 03:09:25.764631 4923 generic.go:334] "Generic (PLEG): container finished" podID="a60fe915-562a-4824-99f5-5875fceebcf9" containerID="88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077" exitCode=0
Feb 24 03:09:25 crc kubenswrapper[4923]: I0224 03:09:25.766153 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98cjm" event={"ID":"a60fe915-562a-4824-99f5-5875fceebcf9","Type":"ContainerDied","Data":"88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077"}
Feb 24 03:09:25 crc kubenswrapper[4923]: I0224 03:09:25.770666 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98cjm" event={"ID":"a60fe915-562a-4824-99f5-5875fceebcf9","Type":"ContainerStarted","Data":"be17e0d404a7b6c1d6e0ce4fdc57c23b04146c81f4885c5dc167a8e50e4b71b8"}
Feb 24 03:09:26 crc kubenswrapper[4923]: I0224 03:09:26.781778 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98cjm" event={"ID":"a60fe915-562a-4824-99f5-5875fceebcf9","Type":"ContainerStarted","Data":"ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80"}
Feb 24 03:09:27 crc kubenswrapper[4923]: I0224 03:09:27.790448 4923 generic.go:334] "Generic (PLEG): container finished" podID="a60fe915-562a-4824-99f5-5875fceebcf9" containerID="ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80" exitCode=0
Feb 24 03:09:27 crc kubenswrapper[4923]: I0224 03:09:27.790493 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98cjm" event={"ID":"a60fe915-562a-4824-99f5-5875fceebcf9","Type":"ContainerDied","Data":"ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80"}
Feb 24 03:09:28 crc kubenswrapper[4923]: I0224 03:09:28.639448 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zscrj"
Feb 24 03:09:28 crc kubenswrapper[4923]: I0224 03:09:28.639871 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zscrj"
Feb 24 03:09:28 crc kubenswrapper[4923]: I0224 03:09:28.666465 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zscrj"
Feb 24 03:09:28 crc kubenswrapper[4923]: I0224 03:09:28.798230 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98cjm" event={"ID":"a60fe915-562a-4824-99f5-5875fceebcf9","Type":"ContainerStarted","Data":"c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89"}
Feb 24 03:09:28 crc kubenswrapper[4923]: I0224 03:09:28.819636 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-98cjm" podStartSLOduration=4.423094584 podStartE2EDuration="6.819615488s" podCreationTimestamp="2026-02-24 03:09:22 +0000 UTC" firstStartedPulling="2026-02-24 03:09:25.772918692 +0000 UTC m=+889.789989505" lastFinishedPulling="2026-02-24 03:09:28.169439596 +0000 UTC m=+892.186510409" observedRunningTime="2026-02-24 03:09:28.814141693 +0000 UTC m=+892.831212506" watchObservedRunningTime="2026-02-24 03:09:28.819615488 +0000 UTC m=+892.836686301"
Feb 24 03:09:28 crc kubenswrapper[4923]: I0224 03:09:28.825245 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zscrj"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.770047 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"]
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.771442 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.773355 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-lgg6p"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.785565 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"]
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.806342 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bp2v\" (UniqueName: \"kubernetes.io/projected/0fd27016-8094-45ad-9298-4f33e7692b7e-kube-api-access-6bp2v\") pod \"5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") " pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.806568 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-util\") pod \"5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") " pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.806658 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-bundle\") pod \"5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") " pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.907598 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bp2v\" (UniqueName: \"kubernetes.io/projected/0fd27016-8094-45ad-9298-4f33e7692b7e-kube-api-access-6bp2v\") pod \"5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") " pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.907828 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-util\") pod \"5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") " pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.907928 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-bundle\") pod \"5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") " pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.908240 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-util\") pod \"5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") " pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.908404 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-bundle\") pod \"5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") " pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:31 crc kubenswrapper[4923]: I0224 03:09:31.928434 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bp2v\" (UniqueName: \"kubernetes.io/projected/0fd27016-8094-45ad-9298-4f33e7692b7e-kube-api-access-6bp2v\") pod \"5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") " pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:32 crc kubenswrapper[4923]: I0224 03:09:32.090532 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:32 crc kubenswrapper[4923]: I0224 03:09:32.315079 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"]
Feb 24 03:09:32 crc kubenswrapper[4923]: I0224 03:09:32.823668 4923 generic.go:334] "Generic (PLEG): container finished" podID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerID="b6f9e20d34a627b96fc613e16bf0cfc76fc9a1f687bb47251715cd015af5dc2b" exitCode=0
Feb 24 03:09:32 crc kubenswrapper[4923]: I0224 03:09:32.823875 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t" event={"ID":"0fd27016-8094-45ad-9298-4f33e7692b7e","Type":"ContainerDied","Data":"b6f9e20d34a627b96fc613e16bf0cfc76fc9a1f687bb47251715cd015af5dc2b"}
Feb 24 03:09:32 crc kubenswrapper[4923]: I0224 03:09:32.823997 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t" event={"ID":"0fd27016-8094-45ad-9298-4f33e7692b7e","Type":"ContainerStarted","Data":"aed47e8d94c143673c00b6d3223e85dd8aa59af0e470314227d74753c64331d9"}
Feb 24 03:09:33 crc kubenswrapper[4923]: I0224 03:09:33.081686 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:33 crc kubenswrapper[4923]: I0224 03:09:33.081741 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:33 crc kubenswrapper[4923]: I0224 03:09:33.145129 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:33 crc kubenswrapper[4923]: I0224 03:09:33.833385 4923 generic.go:334] "Generic (PLEG): container finished" podID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerID="a1f10cc9c6328525023a08fb8e7a5272c9c17351fa427c6c9ab2d786f4b3e3dc" exitCode=0
Feb 24 03:09:33 crc kubenswrapper[4923]: I0224 03:09:33.835065 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t" event={"ID":"0fd27016-8094-45ad-9298-4f33e7692b7e","Type":"ContainerDied","Data":"a1f10cc9c6328525023a08fb8e7a5272c9c17351fa427c6c9ab2d786f4b3e3dc"}
Feb 24 03:09:33 crc kubenswrapper[4923]: I0224 03:09:33.903807 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:34 crc kubenswrapper[4923]: I0224 03:09:34.840340 4923 generic.go:334] "Generic (PLEG): container finished" podID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerID="ca2e7e57632c1483d2a1c8e4e9a0ee5cccd67bb001b4cad6d04acc4e4417c38e" exitCode=0
Feb 24 03:09:34 crc kubenswrapper[4923]: I0224 03:09:34.840388 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t" event={"ID":"0fd27016-8094-45ad-9298-4f33e7692b7e","Type":"ContainerDied","Data":"ca2e7e57632c1483d2a1c8e4e9a0ee5cccd67bb001b4cad6d04acc4e4417c38e"}
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.094170 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.259251 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-bundle\") pod \"0fd27016-8094-45ad-9298-4f33e7692b7e\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") "
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.259325 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-util\") pod \"0fd27016-8094-45ad-9298-4f33e7692b7e\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") "
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.259433 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bp2v\" (UniqueName: \"kubernetes.io/projected/0fd27016-8094-45ad-9298-4f33e7692b7e-kube-api-access-6bp2v\") pod \"0fd27016-8094-45ad-9298-4f33e7692b7e\" (UID: \"0fd27016-8094-45ad-9298-4f33e7692b7e\") "
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.260272 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-bundle" (OuterVolumeSpecName: "bundle") pod "0fd27016-8094-45ad-9298-4f33e7692b7e" (UID: "0fd27016-8094-45ad-9298-4f33e7692b7e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.264625 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd27016-8094-45ad-9298-4f33e7692b7e-kube-api-access-6bp2v" (OuterVolumeSpecName: "kube-api-access-6bp2v") pod "0fd27016-8094-45ad-9298-4f33e7692b7e" (UID: "0fd27016-8094-45ad-9298-4f33e7692b7e"). InnerVolumeSpecName "kube-api-access-6bp2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.272323 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-util" (OuterVolumeSpecName: "util") pod "0fd27016-8094-45ad-9298-4f33e7692b7e" (UID: "0fd27016-8094-45ad-9298-4f33e7692b7e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.360888 4923 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.360938 4923 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0fd27016-8094-45ad-9298-4f33e7692b7e-util\") on node \"crc\" DevicePath \"\""
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.360958 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bp2v\" (UniqueName: \"kubernetes.io/projected/0fd27016-8094-45ad-9298-4f33e7692b7e-kube-api-access-6bp2v\") on node \"crc\" DevicePath \"\""
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.854886 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t" event={"ID":"0fd27016-8094-45ad-9298-4f33e7692b7e","Type":"ContainerDied","Data":"aed47e8d94c143673c00b6d3223e85dd8aa59af0e470314227d74753c64331d9"}
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.854927 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed47e8d94c143673c00b6d3223e85dd8aa59af0e470314227d74753c64331d9"
Feb 24 03:09:36 crc kubenswrapper[4923]: I0224 03:09:36.855239 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t"
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.114685 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-98cjm"]
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.115011 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-98cjm" podUID="a60fe915-562a-4824-99f5-5875fceebcf9" containerName="registry-server" containerID="cri-o://c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89" gracePeriod=2
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.467751 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98cjm"
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.575389 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj45g\" (UniqueName: \"kubernetes.io/projected/a60fe915-562a-4824-99f5-5875fceebcf9-kube-api-access-qj45g\") pod \"a60fe915-562a-4824-99f5-5875fceebcf9\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") "
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.575566 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-utilities\") pod \"a60fe915-562a-4824-99f5-5875fceebcf9\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") "
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.575602 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-catalog-content\") pod \"a60fe915-562a-4824-99f5-5875fceebcf9\" (UID: \"a60fe915-562a-4824-99f5-5875fceebcf9\") "
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.576546 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-utilities" (OuterVolumeSpecName: "utilities") pod "a60fe915-562a-4824-99f5-5875fceebcf9" (UID: "a60fe915-562a-4824-99f5-5875fceebcf9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.610215 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a60fe915-562a-4824-99f5-5875fceebcf9-kube-api-access-qj45g" (OuterVolumeSpecName: "kube-api-access-qj45g") pod "a60fe915-562a-4824-99f5-5875fceebcf9" (UID: "a60fe915-562a-4824-99f5-5875fceebcf9"). InnerVolumeSpecName "kube-api-access-qj45g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.677244 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.677276 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj45g\" (UniqueName: \"kubernetes.io/projected/a60fe915-562a-4824-99f5-5875fceebcf9-kube-api-access-qj45g\") on node \"crc\" DevicePath \"\""
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.763077 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a60fe915-562a-4824-99f5-5875fceebcf9" (UID: "a60fe915-562a-4824-99f5-5875fceebcf9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.777936 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60fe915-562a-4824-99f5-5875fceebcf9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.862052 4923 generic.go:334] "Generic (PLEG): container finished" podID="a60fe915-562a-4824-99f5-5875fceebcf9" containerID="c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89" exitCode=0
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.862094 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98cjm" event={"ID":"a60fe915-562a-4824-99f5-5875fceebcf9","Type":"ContainerDied","Data":"c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89"}
Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.862158 4923 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-98cjm" event={"ID":"a60fe915-562a-4824-99f5-5875fceebcf9","Type":"ContainerDied","Data":"be17e0d404a7b6c1d6e0ce4fdc57c23b04146c81f4885c5dc167a8e50e4b71b8"} Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.862180 4923 scope.go:117] "RemoveContainer" containerID="c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89" Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.862124 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98cjm" Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.883238 4923 scope.go:117] "RemoveContainer" containerID="ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80" Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.901399 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-98cjm"] Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.909920 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-98cjm"] Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.921742 4923 scope.go:117] "RemoveContainer" containerID="88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077" Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.937415 4923 scope.go:117] "RemoveContainer" containerID="c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89" Feb 24 03:09:37 crc kubenswrapper[4923]: E0224 03:09:37.937774 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89\": container with ID starting with c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89 not found: ID does not exist" containerID="c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89" Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.937824 4923 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89"} err="failed to get container status \"c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89\": rpc error: code = NotFound desc = could not find container \"c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89\": container with ID starting with c46d6bad162aeec3ea7a1e338eb5d0f90b289e4e938cf3dac8eb2b9571462d89 not found: ID does not exist" Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.937851 4923 scope.go:117] "RemoveContainer" containerID="ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80" Feb 24 03:09:37 crc kubenswrapper[4923]: E0224 03:09:37.938100 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80\": container with ID starting with ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80 not found: ID does not exist" containerID="ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80" Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.938136 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80"} err="failed to get container status \"ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80\": rpc error: code = NotFound desc = could not find container \"ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80\": container with ID starting with ec77a8a089013650bd41d0c6df5a3194f4d53c8fddb7ea79a4b50ff3989fda80 not found: ID does not exist" Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.938164 4923 scope.go:117] "RemoveContainer" containerID="88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077" Feb 24 03:09:37 crc kubenswrapper[4923]: E0224 
03:09:37.938446 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077\": container with ID starting with 88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077 not found: ID does not exist" containerID="88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077" Feb 24 03:09:37 crc kubenswrapper[4923]: I0224 03:09:37.938477 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077"} err="failed to get container status \"88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077\": rpc error: code = NotFound desc = could not find container \"88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077\": container with ID starting with 88db0598c9d63819cc1e9c91f9b0128f4da791e71cba34017eed753f493e8077 not found: ID does not exist" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.620923 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx"] Feb 24 03:09:39 crc kubenswrapper[4923]: E0224 03:09:39.621377 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a60fe915-562a-4824-99f5-5875fceebcf9" containerName="extract-content" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.621387 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60fe915-562a-4824-99f5-5875fceebcf9" containerName="extract-content" Feb 24 03:09:39 crc kubenswrapper[4923]: E0224 03:09:39.621401 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a60fe915-562a-4824-99f5-5875fceebcf9" containerName="extract-utilities" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.621408 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60fe915-562a-4824-99f5-5875fceebcf9" 
containerName="extract-utilities" Feb 24 03:09:39 crc kubenswrapper[4923]: E0224 03:09:39.621417 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerName="util" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.621425 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerName="util" Feb 24 03:09:39 crc kubenswrapper[4923]: E0224 03:09:39.621433 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerName="pull" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.621440 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerName="pull" Feb 24 03:09:39 crc kubenswrapper[4923]: E0224 03:09:39.621450 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerName="extract" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.621455 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerName="extract" Feb 24 03:09:39 crc kubenswrapper[4923]: E0224 03:09:39.621462 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a60fe915-562a-4824-99f5-5875fceebcf9" containerName="registry-server" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.621468 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60fe915-562a-4824-99f5-5875fceebcf9" containerName="registry-server" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.621567 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="a60fe915-562a-4824-99f5-5875fceebcf9" containerName="registry-server" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.621577 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd27016-8094-45ad-9298-4f33e7692b7e" containerName="extract" Feb 24 03:09:39 crc 
kubenswrapper[4923]: I0224 03:09:39.621924 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.623898 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-2ss9z" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.642205 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx"] Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.707280 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrlc\" (UniqueName: \"kubernetes.io/projected/357fa046-8d84-43b5-8e3c-d1fe18f5d2c5-kube-api-access-ngrlc\") pod \"openstack-operator-controller-init-5677cd7d77-l7zkx\" (UID: \"357fa046-8d84-43b5-8e3c-d1fe18f5d2c5\") " pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.719182 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a60fe915-562a-4824-99f5-5875fceebcf9" path="/var/lib/kubelet/pods/a60fe915-562a-4824-99f5-5875fceebcf9/volumes" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.808473 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrlc\" (UniqueName: \"kubernetes.io/projected/357fa046-8d84-43b5-8e3c-d1fe18f5d2c5-kube-api-access-ngrlc\") pod \"openstack-operator-controller-init-5677cd7d77-l7zkx\" (UID: \"357fa046-8d84-43b5-8e3c-d1fe18f5d2c5\") " pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.826180 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrlc\" (UniqueName: 
\"kubernetes.io/projected/357fa046-8d84-43b5-8e3c-d1fe18f5d2c5-kube-api-access-ngrlc\") pod \"openstack-operator-controller-init-5677cd7d77-l7zkx\" (UID: \"357fa046-8d84-43b5-8e3c-d1fe18f5d2c5\") " pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" Feb 24 03:09:39 crc kubenswrapper[4923]: I0224 03:09:39.940891 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" Feb 24 03:09:40 crc kubenswrapper[4923]: I0224 03:09:40.231073 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx"] Feb 24 03:09:40 crc kubenswrapper[4923]: W0224 03:09:40.246589 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod357fa046_8d84_43b5_8e3c_d1fe18f5d2c5.slice/crio-e110da9c428964e2c796fb4f14122f23d7cd434a3093c5c0b52fa2be1dc06b9d WatchSource:0}: Error finding container e110da9c428964e2c796fb4f14122f23d7cd434a3093c5c0b52fa2be1dc06b9d: Status 404 returned error can't find the container with id e110da9c428964e2c796fb4f14122f23d7cd434a3093c5c0b52fa2be1dc06b9d Feb 24 03:09:40 crc kubenswrapper[4923]: I0224 03:09:40.888691 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" event={"ID":"357fa046-8d84-43b5-8e3c-d1fe18f5d2c5","Type":"ContainerStarted","Data":"e110da9c428964e2c796fb4f14122f23d7cd434a3093c5c0b52fa2be1dc06b9d"} Feb 24 03:09:44 crc kubenswrapper[4923]: I0224 03:09:44.937595 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" event={"ID":"357fa046-8d84-43b5-8e3c-d1fe18f5d2c5","Type":"ContainerStarted","Data":"047c08a8810eba6e64071ef3510c46d6d6a78c820a3627a6d91b5ec1f3dfd6c1"} Feb 24 03:09:44 crc kubenswrapper[4923]: I0224 03:09:44.938133 4923 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" Feb 24 03:09:44 crc kubenswrapper[4923]: I0224 03:09:44.985005 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" podStartSLOduration=2.375929098 podStartE2EDuration="5.984985636s" podCreationTimestamp="2026-02-24 03:09:39 +0000 UTC" firstStartedPulling="2026-02-24 03:09:40.248984405 +0000 UTC m=+904.266055218" lastFinishedPulling="2026-02-24 03:09:43.858040943 +0000 UTC m=+907.875111756" observedRunningTime="2026-02-24 03:09:44.977696534 +0000 UTC m=+908.994767397" watchObservedRunningTime="2026-02-24 03:09:44.984985636 +0000 UTC m=+909.002056459" Feb 24 03:09:49 crc kubenswrapper[4923]: I0224 03:09:49.916742 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:09:49 crc kubenswrapper[4923]: I0224 03:09:49.917398 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:09:49 crc kubenswrapper[4923]: I0224 03:09:49.949120 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5677cd7d77-l7zkx" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.490782 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 
03:10:09.492710 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.495458 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.496349 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.496572 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-cbjsr" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.498036 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xw92x" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.512694 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.521187 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.522374 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.527247 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-qzrdq" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.550326 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.551255 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.555831 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-nb492" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.556795 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xnl\" (UniqueName: \"kubernetes.io/projected/202e32ae-6025-43c3-90ee-d5a6ec2f7752-kube-api-access-w5xnl\") pod \"glance-operator-controller-manager-784b5bb6c5-58z87\" (UID: \"202e32ae-6025-43c3-90ee-d5a6ec2f7752\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.556969 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqv5j\" (UniqueName: \"kubernetes.io/projected/85db5c1e-a62f-496a-a8ce-0e32d4321ac9-kube-api-access-lqv5j\") pod \"cinder-operator-controller-manager-55d77d7b5c-msrg6\" (UID: \"85db5c1e-a62f-496a-a8ce-0e32d4321ac9\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.557007 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjqbd\" (UniqueName: \"kubernetes.io/projected/74dbbb69-5b9b-45b1-a74e-8bed20a6cbed-kube-api-access-sjqbd\") pod \"designate-operator-controller-manager-6d8bf5c495-n2qf9\" (UID: \"74dbbb69-5b9b-45b1-a74e-8bed20a6cbed\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.557029 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9889\" (UniqueName: \"kubernetes.io/projected/13de02b1-8017-4d32-b848-08d241ef34d4-kube-api-access-f9889\") pod \"barbican-operator-controller-manager-868647ff47-lnz72\" (UID: \"13de02b1-8017-4d32-b848-08d241ef34d4\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.568784 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.574566 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.575242 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.578905 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vsxx8" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.590520 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.602870 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.610712 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.619350 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.620225 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.622232 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4nvc4" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.634471 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.638481 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.639828 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.646390 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.675326 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.682555 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xnl\" (UniqueName: \"kubernetes.io/projected/202e32ae-6025-43c3-90ee-d5a6ec2f7752-kube-api-access-w5xnl\") pod \"glance-operator-controller-manager-784b5bb6c5-58z87\" (UID: \"202e32ae-6025-43c3-90ee-d5a6ec2f7752\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.682642 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ccz\" (UniqueName: \"kubernetes.io/projected/2dda58b4-8524-47cf-9e31-f276859d0af1-kube-api-access-j2ccz\") pod \"horizon-operator-controller-manager-5b9b8895d5-2mdh9\" (UID: \"2dda58b4-8524-47cf-9e31-f276859d0af1\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.682719 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p62nk\" (UniqueName: \"kubernetes.io/projected/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-kube-api-access-p62nk\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: 
\"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.682748 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bnw8\" (UniqueName: \"kubernetes.io/projected/07169ece-c03c-464a-9899-f03b61426df5-kube-api-access-6bnw8\") pod \"heat-operator-controller-manager-69f49c598c-tbhxh\" (UID: \"07169ece-c03c-464a-9899-f03b61426df5\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.682780 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqv5j\" (UniqueName: \"kubernetes.io/projected/85db5c1e-a62f-496a-a8ce-0e32d4321ac9-kube-api-access-lqv5j\") pod \"cinder-operator-controller-manager-55d77d7b5c-msrg6\" (UID: \"85db5c1e-a62f-496a-a8ce-0e32d4321ac9\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.682811 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjqbd\" (UniqueName: \"kubernetes.io/projected/74dbbb69-5b9b-45b1-a74e-8bed20a6cbed-kube-api-access-sjqbd\") pod \"designate-operator-controller-manager-6d8bf5c495-n2qf9\" (UID: \"74dbbb69-5b9b-45b1-a74e-8bed20a6cbed\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.682838 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:09 crc kubenswrapper[4923]: 
I0224 03:10:09.682862 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9889\" (UniqueName: \"kubernetes.io/projected/13de02b1-8017-4d32-b848-08d241ef34d4-kube-api-access-f9889\") pod \"barbican-operator-controller-manager-868647ff47-lnz72\" (UID: \"13de02b1-8017-4d32-b848-08d241ef34d4\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.683385 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.684518 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-b9sx9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.729978 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.738428 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jrtlk" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.740082 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjqbd\" (UniqueName: \"kubernetes.io/projected/74dbbb69-5b9b-45b1-a74e-8bed20a6cbed-kube-api-access-sjqbd\") pod \"designate-operator-controller-manager-6d8bf5c495-n2qf9\" (UID: \"74dbbb69-5b9b-45b1-a74e-8bed20a6cbed\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.760860 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9889\" (UniqueName: \"kubernetes.io/projected/13de02b1-8017-4d32-b848-08d241ef34d4-kube-api-access-f9889\") pod 
\"barbican-operator-controller-manager-868647ff47-lnz72\" (UID: \"13de02b1-8017-4d32-b848-08d241ef34d4\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.761371 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xnl\" (UniqueName: \"kubernetes.io/projected/202e32ae-6025-43c3-90ee-d5a6ec2f7752-kube-api-access-w5xnl\") pod \"glance-operator-controller-manager-784b5bb6c5-58z87\" (UID: \"202e32ae-6025-43c3-90ee-d5a6ec2f7752\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.762905 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqv5j\" (UniqueName: \"kubernetes.io/projected/85db5c1e-a62f-496a-a8ce-0e32d4321ac9-kube-api-access-lqv5j\") pod \"cinder-operator-controller-manager-55d77d7b5c-msrg6\" (UID: \"85db5c1e-a62f-496a-a8ce-0e32d4321ac9\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.771745 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.772525 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.772556 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.772727 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.773138 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.779485 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-qdrm6" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.784345 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.784942 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ccz\" (UniqueName: \"kubernetes.io/projected/2dda58b4-8524-47cf-9e31-f276859d0af1-kube-api-access-j2ccz\") pod \"horizon-operator-controller-manager-5b9b8895d5-2mdh9\" (UID: \"2dda58b4-8524-47cf-9e31-f276859d0af1\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.784988 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p62nk\" (UniqueName: \"kubernetes.io/projected/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-kube-api-access-p62nk\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.785010 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bnw8\" (UniqueName: \"kubernetes.io/projected/07169ece-c03c-464a-9899-f03b61426df5-kube-api-access-6bnw8\") pod \"heat-operator-controller-manager-69f49c598c-tbhxh\" (UID: 
\"07169ece-c03c-464a-9899-f03b61426df5\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.785040 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:09 crc kubenswrapper[4923]: E0224 03:10:09.785136 4923 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:09 crc kubenswrapper[4923]: E0224 03:10:09.785180 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert podName:26fc13d5-b98a-49ac-8c62-ee3ab08a9767 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:10.285163286 +0000 UTC m=+934.302234089 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert") pod "infra-operator-controller-manager-79d975b745-bmn7v" (UID: "26fc13d5-b98a-49ac-8c62-ee3ab08a9767") : secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.793404 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-pz8g9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.798417 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.817909 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.818743 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.819467 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.820963 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-5dql7" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.828569 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.829570 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.835202 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p62nk\" (UniqueName: \"kubernetes.io/projected/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-kube-api-access-p62nk\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.840041 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ccz\" (UniqueName: \"kubernetes.io/projected/2dda58b4-8524-47cf-9e31-f276859d0af1-kube-api-access-j2ccz\") pod \"horizon-operator-controller-manager-5b9b8895d5-2mdh9\" (UID: \"2dda58b4-8524-47cf-9e31-f276859d0af1\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.852926 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.853364 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.854048 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.856052 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.858182 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-nq82q" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.861848 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.862602 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.864452 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bnw8\" (UniqueName: \"kubernetes.io/projected/07169ece-c03c-464a-9899-f03b61426df5-kube-api-access-6bnw8\") pod \"heat-operator-controller-manager-69f49c598c-tbhxh\" (UID: \"07169ece-c03c-464a-9899-f03b61426df5\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.867542 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rw88w" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.868698 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.873474 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5"] Feb 24 03:10:09 crc kubenswrapper[4923]: 
I0224 03:10:09.874197 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.874720 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.885772 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-hddlk" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.889610 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.889884 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9lmd\" (UniqueName: \"kubernetes.io/projected/7228fc47-38cb-4680-9104-d5657a853147-kube-api-access-d9lmd\") pod \"manila-operator-controller-manager-67d996989d-4d7n7\" (UID: \"7228fc47-38cb-4680-9104-d5657a853147\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.889937 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc6qf\" (UniqueName: \"kubernetes.io/projected/6a78564f-37a8-4385-9f93-57ee3952d36c-kube-api-access-qc6qf\") pod \"keystone-operator-controller-manager-b4d948c87-tqvjv\" (UID: \"6a78564f-37a8-4385-9f93-57ee3952d36c\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.890015 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw82k\" (UniqueName: 
\"kubernetes.io/projected/1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1-kube-api-access-kw82k\") pod \"ironic-operator-controller-manager-554564d7fc-pd2zn\" (UID: \"1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.890063 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96qjq\" (UniqueName: \"kubernetes.io/projected/400c9c7a-a90c-4b16-b13d-25c26be22f93-kube-api-access-96qjq\") pod \"mariadb-operator-controller-manager-6994f66f48-4c6hc\" (UID: \"400c9c7a-a90c-4b16-b13d-25c26be22f93\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.890727 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.895599 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.896761 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.911183 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-djsbj" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.911477 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.921762 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.922557 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.936654 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2lbzh" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.950417 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.954388 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.978713 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf"] Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.979481 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.988160 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gwjqm" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991079 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96qjq\" (UniqueName: \"kubernetes.io/projected/400c9c7a-a90c-4b16-b13d-25c26be22f93-kube-api-access-96qjq\") pod \"mariadb-operator-controller-manager-6994f66f48-4c6hc\" (UID: \"400c9c7a-a90c-4b16-b13d-25c26be22f93\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991152 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9lmd\" (UniqueName: \"kubernetes.io/projected/7228fc47-38cb-4680-9104-d5657a853147-kube-api-access-d9lmd\") pod \"manila-operator-controller-manager-67d996989d-4d7n7\" (UID: \"7228fc47-38cb-4680-9104-d5657a853147\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991190 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc6qf\" (UniqueName: \"kubernetes.io/projected/6a78564f-37a8-4385-9f93-57ee3952d36c-kube-api-access-qc6qf\") pod \"keystone-operator-controller-manager-b4d948c87-tqvjv\" (UID: \"6a78564f-37a8-4385-9f93-57ee3952d36c\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991217 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-926jr\" (UniqueName: \"kubernetes.io/projected/31f694c1-3948-4e87-90d1-5bd1e7d0aef6-kube-api-access-926jr\") pod 
\"octavia-operator-controller-manager-659dc6bbfc-r64q5\" (UID: \"31f694c1-3948-4e87-90d1-5bd1e7d0aef6\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991275 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkk6v\" (UniqueName: \"kubernetes.io/projected/8c5a7840-9e6b-4442-b99e-1ce50bff0722-kube-api-access-bkk6v\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991357 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln2jm\" (UniqueName: \"kubernetes.io/projected/063cbd60-dc19-4c47-96ca-7b9cb24bf2ef-kube-api-access-ln2jm\") pod \"nova-operator-controller-manager-567668f5cf-c8n7r\" (UID: \"063cbd60-dc19-4c47-96ca-7b9cb24bf2ef\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991408 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6kl\" (UniqueName: \"kubernetes.io/projected/039de08e-513c-47e3-a3f7-59b8911b7dae-kube-api-access-mz6kl\") pod \"neutron-operator-controller-manager-6bd4687957-x6vmb\" (UID: \"039de08e-513c-47e3-a3f7-59b8911b7dae\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991452 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw82k\" (UniqueName: \"kubernetes.io/projected/1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1-kube-api-access-kw82k\") pod \"ironic-operator-controller-manager-554564d7fc-pd2zn\" (UID: 
\"1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991485 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:09 crc kubenswrapper[4923]: I0224 03:10:09.991520 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb4rp\" (UniqueName: \"kubernetes.io/projected/8b9a0e9e-0ef9-4b69-87f3-63cfb4204996-kube-api-access-xb4rp\") pod \"ovn-operator-controller-manager-5955d8c787-2zxl4\" (UID: \"8b9a0e9e-0ef9-4b69-87f3-63cfb4204996\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.017396 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.018366 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.025255 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-h2g82" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.028435 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.029099 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc6qf\" (UniqueName: \"kubernetes.io/projected/6a78564f-37a8-4385-9f93-57ee3952d36c-kube-api-access-qc6qf\") pod \"keystone-operator-controller-manager-b4d948c87-tqvjv\" (UID: \"6a78564f-37a8-4385-9f93-57ee3952d36c\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.033719 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.034898 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.040287 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.041214 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-7pbzr" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.047308 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96qjq\" (UniqueName: \"kubernetes.io/projected/400c9c7a-a90c-4b16-b13d-25c26be22f93-kube-api-access-96qjq\") pod \"mariadb-operator-controller-manager-6994f66f48-4c6hc\" (UID: \"400c9c7a-a90c-4b16-b13d-25c26be22f93\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.050655 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.051371 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9lmd\" (UniqueName: \"kubernetes.io/projected/7228fc47-38cb-4680-9104-d5657a853147-kube-api-access-d9lmd\") pod \"manila-operator-controller-manager-67d996989d-4d7n7\" (UID: \"7228fc47-38cb-4680-9104-d5657a853147\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.058167 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw82k\" (UniqueName: \"kubernetes.io/projected/1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1-kube-api-access-kw82k\") pod \"ironic-operator-controller-manager-554564d7fc-pd2zn\" (UID: \"1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1\") " 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.073946 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.088864 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.090927 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.094752 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6kl\" (UniqueName: \"kubernetes.io/projected/039de08e-513c-47e3-a3f7-59b8911b7dae-kube-api-access-mz6kl\") pod \"neutron-operator-controller-manager-6bd4687957-x6vmb\" (UID: \"039de08e-513c-47e3-a3f7-59b8911b7dae\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.094793 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn5gn\" (UniqueName: \"kubernetes.io/projected/4981f2a3-3977-40b7-819b-59cf400fa882-kube-api-access-pn5gn\") pod \"telemetry-operator-controller-manager-589c568786-f4b5c\" (UID: \"4981f2a3-3977-40b7-819b-59cf400fa882\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.094825 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.094840 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v454w\" (UniqueName: \"kubernetes.io/projected/d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e-kube-api-access-v454w\") pod \"placement-operator-controller-manager-8497b45c89-m8tjf\" (UID: \"d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.094864 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb4rp\" (UniqueName: \"kubernetes.io/projected/8b9a0e9e-0ef9-4b69-87f3-63cfb4204996-kube-api-access-xb4rp\") pod \"ovn-operator-controller-manager-5955d8c787-2zxl4\" (UID: \"8b9a0e9e-0ef9-4b69-87f3-63cfb4204996\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.094900 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-926jr\" (UniqueName: \"kubernetes.io/projected/31f694c1-3948-4e87-90d1-5bd1e7d0aef6-kube-api-access-926jr\") pod \"octavia-operator-controller-manager-659dc6bbfc-r64q5\" (UID: \"31f694c1-3948-4e87-90d1-5bd1e7d0aef6\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.094939 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkk6v\" (UniqueName: \"kubernetes.io/projected/8c5a7840-9e6b-4442-b99e-1ce50bff0722-kube-api-access-bkk6v\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:10 crc 
kubenswrapper[4923]: I0224 03:10:10.094961 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln2jm\" (UniqueName: \"kubernetes.io/projected/063cbd60-dc19-4c47-96ca-7b9cb24bf2ef-kube-api-access-ln2jm\") pod \"nova-operator-controller-manager-567668f5cf-c8n7r\" (UID: \"063cbd60-dc19-4c47-96ca-7b9cb24bf2ef\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.094980 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn5vd\" (UniqueName: \"kubernetes.io/projected/a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e-kube-api-access-fn5vd\") pod \"swift-operator-controller-manager-68f46476f-zxhc7\" (UID: \"a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" Feb 24 03:10:10 crc kubenswrapper[4923]: E0224 03:10:10.095252 4923 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:10 crc kubenswrapper[4923]: E0224 03:10:10.095310 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert podName:8c5a7840-9e6b-4442-b99e-1ce50bff0722 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:10.595279807 +0000 UTC m=+934.612350620 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" (UID: "8c5a7840-9e6b-4442-b99e-1ce50bff0722") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.101373 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.101607 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lwqkt" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.114619 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.114878 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.135445 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln2jm\" (UniqueName: \"kubernetes.io/projected/063cbd60-dc19-4c47-96ca-7b9cb24bf2ef-kube-api-access-ln2jm\") pod \"nova-operator-controller-manager-567668f5cf-c8n7r\" (UID: \"063cbd60-dc19-4c47-96ca-7b9cb24bf2ef\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.137291 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-926jr\" (UniqueName: \"kubernetes.io/projected/31f694c1-3948-4e87-90d1-5bd1e7d0aef6-kube-api-access-926jr\") pod \"octavia-operator-controller-manager-659dc6bbfc-r64q5\" (UID: \"31f694c1-3948-4e87-90d1-5bd1e7d0aef6\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.137418 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb4rp\" (UniqueName: \"kubernetes.io/projected/8b9a0e9e-0ef9-4b69-87f3-63cfb4204996-kube-api-access-xb4rp\") pod \"ovn-operator-controller-manager-5955d8c787-2zxl4\" (UID: \"8b9a0e9e-0ef9-4b69-87f3-63cfb4204996\") " 
pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.142058 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkk6v\" (UniqueName: \"kubernetes.io/projected/8c5a7840-9e6b-4442-b99e-1ce50bff0722-kube-api-access-bkk6v\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.183140 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.189179 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.217579 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.217677 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.218075 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn5gn\" (UniqueName: \"kubernetes.io/projected/4981f2a3-3977-40b7-819b-59cf400fa882-kube-api-access-pn5gn\") pod \"telemetry-operator-controller-manager-589c568786-f4b5c\" (UID: \"4981f2a3-3977-40b7-819b-59cf400fa882\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.218141 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcztp\" (UniqueName: \"kubernetes.io/projected/4ba7a21e-9aef-4596-9e18-21c66394cf74-kube-api-access-wcztp\") pod \"test-operator-controller-manager-5dc6794d5b-5pzgk\" (UID: \"4ba7a21e-9aef-4596-9e18-21c66394cf74\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.218215 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v454w\" (UniqueName: \"kubernetes.io/projected/d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e-kube-api-access-v454w\") pod \"placement-operator-controller-manager-8497b45c89-m8tjf\" (UID: \"d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.218415 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn5vd\" (UniqueName: \"kubernetes.io/projected/a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e-kube-api-access-fn5vd\") pod \"swift-operator-controller-manager-68f46476f-zxhc7\" (UID: \"a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" Feb 24 03:10:10 crc 
kubenswrapper[4923]: I0224 03:10:10.218870 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.225836 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-b7dtf" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.244757 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz6kl\" (UniqueName: \"kubernetes.io/projected/039de08e-513c-47e3-a3f7-59b8911b7dae-kube-api-access-mz6kl\") pod \"neutron-operator-controller-manager-6bd4687957-x6vmb\" (UID: \"039de08e-513c-47e3-a3f7-59b8911b7dae\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.246345 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.273092 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn5gn\" (UniqueName: \"kubernetes.io/projected/4981f2a3-3977-40b7-819b-59cf400fa882-kube-api-access-pn5gn\") pod \"telemetry-operator-controller-manager-589c568786-f4b5c\" (UID: \"4981f2a3-3977-40b7-819b-59cf400fa882\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.273220 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn5vd\" (UniqueName: \"kubernetes.io/projected/a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e-kube-api-access-fn5vd\") pod \"swift-operator-controller-manager-68f46476f-zxhc7\" (UID: \"a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" Feb 24 03:10:10 crc 
kubenswrapper[4923]: I0224 03:10:10.273245 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v454w\" (UniqueName: \"kubernetes.io/projected/d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e-kube-api-access-v454w\") pod \"placement-operator-controller-manager-8497b45c89-m8tjf\" (UID: \"d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.275985 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.300617 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.323479 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.325269 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8dbl\" (UniqueName: \"kubernetes.io/projected/ae81debf-3361-424d-afbe-9e4521997d23-kube-api-access-q8dbl\") pod \"watcher-operator-controller-manager-bccc79885-2bgtt\" (UID: \"ae81debf-3361-424d-afbe-9e4521997d23\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" Feb 24 03:10:10 crc kubenswrapper[4923]: E0224 03:10:10.323726 4923 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:10 crc 
kubenswrapper[4923]: E0224 03:10:10.325523 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert podName:26fc13d5-b98a-49ac-8c62-ee3ab08a9767 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:11.325497147 +0000 UTC m=+935.342567960 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert") pod "infra-operator-controller-manager-79d975b745-bmn7v" (UID: "26fc13d5-b98a-49ac-8c62-ee3ab08a9767") : secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.325706 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcztp\" (UniqueName: \"kubernetes.io/projected/4ba7a21e-9aef-4596-9e18-21c66394cf74-kube-api-access-wcztp\") pod \"test-operator-controller-manager-5dc6794d5b-5pzgk\" (UID: \"4ba7a21e-9aef-4596-9e18-21c66394cf74\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.346413 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.347281 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.350060 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wr8bb" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.350225 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.355410 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.356002 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcztp\" (UniqueName: \"kubernetes.io/projected/4ba7a21e-9aef-4596-9e18-21c66394cf74-kube-api-access-wcztp\") pod \"test-operator-controller-manager-5dc6794d5b-5pzgk\" (UID: \"4ba7a21e-9aef-4596-9e18-21c66394cf74\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.380997 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.424185 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.424712 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.428036 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.428088 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.428143 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qwkl\" (UniqueName: \"kubernetes.io/projected/6b2c8692-7382-4716-8770-47d21209898f-kube-api-access-8qwkl\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.428180 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8dbl\" (UniqueName: \"kubernetes.io/projected/ae81debf-3361-424d-afbe-9e4521997d23-kube-api-access-q8dbl\") pod 
\"watcher-operator-controller-manager-bccc79885-2bgtt\" (UID: \"ae81debf-3361-424d-afbe-9e4521997d23\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.441641 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.442506 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.460209 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8dbl\" (UniqueName: \"kubernetes.io/projected/ae81debf-3361-424d-afbe-9e4521997d23-kube-api-access-q8dbl\") pod \"watcher-operator-controller-manager-bccc79885-2bgtt\" (UID: \"ae81debf-3361-424d-afbe-9e4521997d23\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.460280 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zv5z7" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.461807 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.504747 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.528928 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qwkl\" (UniqueName: \"kubernetes.io/projected/6b2c8692-7382-4716-8770-47d21209898f-kube-api-access-8qwkl\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.529272 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.529310 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.529357 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz29d\" (UniqueName: \"kubernetes.io/projected/7e46cf81-12eb-4c37-9f04-affbd9f153b7-kube-api-access-nz29d\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vczgc\" (UID: \"7e46cf81-12eb-4c37-9f04-affbd9f153b7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" Feb 24 03:10:10 crc kubenswrapper[4923]: E0224 03:10:10.529697 4923 
secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 24 03:10:10 crc kubenswrapper[4923]: E0224 03:10:10.529741 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:11.029730001 +0000 UTC m=+935.046800814 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "webhook-server-cert" not found Feb 24 03:10:10 crc kubenswrapper[4923]: E0224 03:10:10.529876 4923 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 24 03:10:10 crc kubenswrapper[4923]: E0224 03:10:10.529905 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:11.029898076 +0000 UTC m=+935.046968889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "metrics-server-cert" not found Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.530047 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.550414 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qwkl\" (UniqueName: \"kubernetes.io/projected/6b2c8692-7382-4716-8770-47d21209898f-kube-api-access-8qwkl\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.570443 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.584192 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.612310 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" Feb 24 03:10:10 crc kubenswrapper[4923]: W0224 03:10:10.626230 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74dbbb69_5b9b_45b1_a74e_8bed20a6cbed.slice/crio-384a63ef9f5d7cd3b5885ffe9f67fc08ac712f5b836b3cc58870ffb31d7132fb WatchSource:0}: Error finding container 384a63ef9f5d7cd3b5885ffe9f67fc08ac712f5b836b3cc58870ffb31d7132fb: Status 404 returned error can't find the container with id 384a63ef9f5d7cd3b5885ffe9f67fc08ac712f5b836b3cc58870ffb31d7132fb Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.631061 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.631139 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz29d\" (UniqueName: \"kubernetes.io/projected/7e46cf81-12eb-4c37-9f04-affbd9f153b7-kube-api-access-nz29d\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vczgc\" (UID: \"7e46cf81-12eb-4c37-9f04-affbd9f153b7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" Feb 24 03:10:10 crc kubenswrapper[4923]: E0224 03:10:10.631231 4923 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:10 crc kubenswrapper[4923]: E0224 03:10:10.631310 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert 
podName:8c5a7840-9e6b-4442-b99e-1ce50bff0722 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:11.631281613 +0000 UTC m=+935.648352416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" (UID: "8c5a7840-9e6b-4442-b99e-1ce50bff0722") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.649512 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz29d\" (UniqueName: \"kubernetes.io/projected/7e46cf81-12eb-4c37-9f04-affbd9f153b7-kube-api-access-nz29d\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vczgc\" (UID: \"7e46cf81-12eb-4c37-9f04-affbd9f153b7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.776885 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.823745 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6"] Feb 24 03:10:10 crc kubenswrapper[4923]: I0224 03:10:10.838488 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72"] Feb 24 03:10:10 crc kubenswrapper[4923]: W0224 03:10:10.858010 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13de02b1_8017_4d32_b848_08d241ef34d4.slice/crio-8f68c1c7d9b4fa1ace81c369fe935c85ab7372e5fbe4a82258e2e5e064e609c6 WatchSource:0}: Error finding container 8f68c1c7d9b4fa1ace81c369fe935c85ab7372e5fbe4a82258e2e5e064e609c6: Status 404 returned error can't find the container with id 8f68c1c7d9b4fa1ace81c369fe935c85ab7372e5fbe4a82258e2e5e064e609c6 Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.026416 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.030841 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.038075 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.038275 4923 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.038218 4923 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.038461 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:12.038442456 +0000 UTC m=+936.055513269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "webhook-server-cert" not found Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.038404 4923 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.038824 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:12.038812546 +0000 UTC m=+936.055883359 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "metrics-server-cert" not found Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.043708 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.088907 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.096191 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv"] Feb 24 03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.104191 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a78564f_37a8_4385_9f93_57ee3952d36c.slice/crio-36cb812fbafa0ce049d64ac80821627a1327226772d9c412dd3b3fdb22c67e50 WatchSource:0}: Error finding container 36cb812fbafa0ce049d64ac80821627a1327226772d9c412dd3b3fdb22c67e50: Status 404 returned error can't find the container with id 36cb812fbafa0ce049d64ac80821627a1327226772d9c412dd3b3fdb22c67e50 Feb 24 03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.106795 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dda58b4_8524_47cf_9e31_f276859d0af1.slice/crio-b0b7e2cc4cfb99c841a027deb9f52fdaa887e423ee463288d2765ea78c98f80e WatchSource:0}: Error finding container b0b7e2cc4cfb99c841a027deb9f52fdaa887e423ee463288d2765ea78c98f80e: Status 404 returned error can't find the container with id b0b7e2cc4cfb99c841a027deb9f52fdaa887e423ee463288d2765ea78c98f80e Feb 24 03:10:11 crc 
kubenswrapper[4923]: I0224 03:10:11.217262 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" event={"ID":"6a78564f-37a8-4385-9f93-57ee3952d36c","Type":"ContainerStarted","Data":"36cb812fbafa0ce049d64ac80821627a1327226772d9c412dd3b3fdb22c67e50"} Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.219432 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" event={"ID":"74dbbb69-5b9b-45b1-a74e-8bed20a6cbed","Type":"ContainerStarted","Data":"384a63ef9f5d7cd3b5885ffe9f67fc08ac712f5b836b3cc58870ffb31d7132fb"} Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.220390 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" event={"ID":"2dda58b4-8524-47cf-9e31-f276859d0af1","Type":"ContainerStarted","Data":"b0b7e2cc4cfb99c841a027deb9f52fdaa887e423ee463288d2765ea78c98f80e"} Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.221196 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" event={"ID":"1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1","Type":"ContainerStarted","Data":"fa1d959907057e0c80725efe962a155fcfaec531738cec84624bb385068bbd69"} Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.223215 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" event={"ID":"07169ece-c03c-464a-9899-f03b61426df5","Type":"ContainerStarted","Data":"eed71429ece87fc71681ef8bc6fbf944f81365ae7332e4204fbab5c715b52c0f"} Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.231113 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" 
event={"ID":"202e32ae-6025-43c3-90ee-d5a6ec2f7752","Type":"ContainerStarted","Data":"833e7ad8e74b86c9f748b0570492e838cdf886ccd72b68b999ce16b768a85f74"} Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.235168 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" event={"ID":"85db5c1e-a62f-496a-a8ce-0e32d4321ac9","Type":"ContainerStarted","Data":"4da9beaf34d627c5e62e3338c929d3474f116c6f6423d09a6ce9a6f36bcbd43f"} Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.236747 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" event={"ID":"13de02b1-8017-4d32-b848-08d241ef34d4","Type":"ContainerStarted","Data":"8f68c1c7d9b4fa1ace81c369fe935c85ab7372e5fbe4a82258e2e5e064e609c6"} Feb 24 03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.241381 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b9a0e9e_0ef9_4b69_87f3_63cfb4204996.slice/crio-385c90d853c0ef46c2339bdff727d9233b15defdad9f9b0398d91d4d787cffde WatchSource:0}: Error finding container 385c90d853c0ef46c2339bdff727d9233b15defdad9f9b0398d91d4d787cffde: Status 404 returned error can't find the container with id 385c90d853c0ef46c2339bdff727d9233b15defdad9f9b0398d91d4d787cffde Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.242110 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4"] Feb 24 03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.244253 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7228fc47_38cb_4680_9104_d5657a853147.slice/crio-3f20260eba6b3455cedc066028a6e491035dd2665bb94ed9bd9d42a83e589a0f WatchSource:0}: Error finding container 3f20260eba6b3455cedc066028a6e491035dd2665bb94ed9bd9d42a83e589a0f: 
Status 404 returned error can't find the container with id 3f20260eba6b3455cedc066028a6e491035dd2665bb94ed9bd9d42a83e589a0f Feb 24 03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.244970 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod063cbd60_dc19_4c47_96ca_7b9cb24bf2ef.slice/crio-783afee2d8859dddbfb505db42d0aa9e2a2f8e519e9b6fde43529b05b697bccc WatchSource:0}: Error finding container 783afee2d8859dddbfb505db42d0aa9e2a2f8e519e9b6fde43529b05b697bccc: Status 404 returned error can't find the container with id 783afee2d8859dddbfb505db42d0aa9e2a2f8e519e9b6fde43529b05b697bccc Feb 24 03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.251335 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31f694c1_3948_4e87_90d1_5bd1e7d0aef6.slice/crio-76b96ef2f7a93548a98eb97b787703b4db30dccc71ed05c5c3da61e2fc077ea6 WatchSource:0}: Error finding container 76b96ef2f7a93548a98eb97b787703b4db30dccc71ed05c5c3da61e2fc077ea6: Status 404 returned error can't find the container with id 76b96ef2f7a93548a98eb97b787703b4db30dccc71ed05c5c3da61e2fc077ea6 Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.252818 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.262722 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.268709 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.320978 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c"] Feb 24 
03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.331656 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4981f2a3_3977_40b7_819b_59cf400fa882.slice/crio-7beae9d570c28f97f5aec7ab7e93c556b4f6ec57fbd4a1a9bd24405e97719bf8 WatchSource:0}: Error finding container 7beae9d570c28f97f5aec7ab7e93c556b4f6ec57fbd4a1a9bd24405e97719bf8: Status 404 returned error can't find the container with id 7beae9d570c28f97f5aec7ab7e93c556b4f6ec57fbd4a1a9bd24405e97719bf8 Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.340399 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.343190 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.346289 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf"] Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.346385 4923 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.346465 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert podName:26fc13d5-b98a-49ac-8c62-ee3ab08a9767 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:13.346442841 +0000 UTC m=+937.363513664 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert") pod "infra-operator-controller-manager-79d975b745-bmn7v" (UID: "26fc13d5-b98a-49ac-8c62-ee3ab08a9767") : secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.351554 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400c9c7a_a90c_4b16_b13d_25c26be22f93.slice/crio-6152c72b95ad4c6c2587dbf957082206b4bf12cfd3ec8d51ed2c2b0cb5a5808b WatchSource:0}: Error finding container 6152c72b95ad4c6c2587dbf957082206b4bf12cfd3ec8d51ed2c2b0cb5a5808b: Status 404 returned error can't find the container with id 6152c72b95ad4c6c2587dbf957082206b4bf12cfd3ec8d51ed2c2b0cb5a5808b Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.358281 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.365581 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb"] Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.368368 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} 
{} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v454w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-m8tjf_openstack-operators(d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.371510 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" podUID="d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.378261 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fn5vd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-zxhc7_openstack-operators(a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.379011 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-96qjq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-4c6hc_openstack-operators(400c9c7a-a90c-4b16-b13d-25c26be22f93): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 24 03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.379470 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod039de08e_513c_47e3_a3f7_59b8911b7dae.slice/crio-d4fe634827727346c03794b897d5312d9184ce29249beeb85d974ad177cbcaad WatchSource:0}: Error finding container 
d4fe634827727346c03794b897d5312d9184ce29249beeb85d974ad177cbcaad: Status 404 returned error can't find the container with id d4fe634827727346c03794b897d5312d9184ce29249beeb85d974ad177cbcaad Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.379473 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" podUID="a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.380384 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" podUID="400c9c7a-a90c-4b16-b13d-25c26be22f93" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.381980 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mz6kl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6bd4687957-x6vmb_openstack-operators(039de08e-513c-47e3-a3f7-59b8911b7dae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.385358 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" podUID="039de08e-513c-47e3-a3f7-59b8911b7dae" Feb 24 03:10:11 crc 
kubenswrapper[4923]: I0224 03:10:11.470803 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt"] Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.482169 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc"] Feb 24 03:10:11 crc kubenswrapper[4923]: W0224 03:10:11.483867 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae81debf_3361_424d_afbe_9e4521997d23.slice/crio-0bda9a1e60c26c6b878ef7afcebd21ab583f5502c1c5286fded758d46fcfb12d WatchSource:0}: Error finding container 0bda9a1e60c26c6b878ef7afcebd21ab583f5502c1c5286fded758d46fcfb12d: Status 404 returned error can't find the container with id 0bda9a1e60c26c6b878ef7afcebd21ab583f5502c1c5286fded758d46fcfb12d Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.509452 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk"] Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.513724 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wcztp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-5pzgk_openstack-operators(4ba7a21e-9aef-4596-9e18-21c66394cf74): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.514876 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" podUID="4ba7a21e-9aef-4596-9e18-21c66394cf74" Feb 24 
03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.518307 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nz29d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod rabbitmq-cluster-operator-manager-668c99d594-vczgc_openstack-operators(7e46cf81-12eb-4c37-9f04-affbd9f153b7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.519495 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" podUID="7e46cf81-12eb-4c37-9f04-affbd9f153b7" Feb 24 03:10:11 crc kubenswrapper[4923]: I0224 03:10:11.647537 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.647743 4923 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:11 crc kubenswrapper[4923]: E0224 03:10:11.647959 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert podName:8c5a7840-9e6b-4442-b99e-1ce50bff0722 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:13.647863921 +0000 UTC m=+937.664934734 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" (UID: "8c5a7840-9e6b-4442-b99e-1ce50bff0722") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.052620 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.053024 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.052832 4923 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.053176 4923 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.053231 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:14.053208767 +0000 UTC m=+938.070279620 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "webhook-server-cert" not found Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.053251 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:14.053244208 +0000 UTC m=+938.070315091 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "metrics-server-cert" not found Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.255869 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" event={"ID":"7228fc47-38cb-4680-9104-d5657a853147","Type":"ContainerStarted","Data":"3f20260eba6b3455cedc066028a6e491035dd2665bb94ed9bd9d42a83e589a0f"} Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.259462 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" event={"ID":"063cbd60-dc19-4c47-96ca-7b9cb24bf2ef","Type":"ContainerStarted","Data":"783afee2d8859dddbfb505db42d0aa9e2a2f8e519e9b6fde43529b05b697bccc"} Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.262060 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" 
event={"ID":"400c9c7a-a90c-4b16-b13d-25c26be22f93","Type":"ContainerStarted","Data":"6152c72b95ad4c6c2587dbf957082206b4bf12cfd3ec8d51ed2c2b0cb5a5808b"} Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.265451 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" podUID="400c9c7a-a90c-4b16-b13d-25c26be22f93" Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.266055 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" event={"ID":"4981f2a3-3977-40b7-819b-59cf400fa882","Type":"ContainerStarted","Data":"7beae9d570c28f97f5aec7ab7e93c556b4f6ec57fbd4a1a9bd24405e97719bf8"} Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.281731 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" event={"ID":"ae81debf-3361-424d-afbe-9e4521997d23","Type":"ContainerStarted","Data":"0bda9a1e60c26c6b878ef7afcebd21ab583f5502c1c5286fded758d46fcfb12d"} Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.288908 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" event={"ID":"8b9a0e9e-0ef9-4b69-87f3-63cfb4204996","Type":"ContainerStarted","Data":"385c90d853c0ef46c2339bdff727d9233b15defdad9f9b0398d91d4d787cffde"} Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.296272 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" 
event={"ID":"a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e","Type":"ContainerStarted","Data":"d1096e44c15041f8370fb756f68cff5188116cd32c46a6370a0881173be9f3fe"} Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.299654 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" podUID="a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e" Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.300353 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" event={"ID":"7e46cf81-12eb-4c37-9f04-affbd9f153b7","Type":"ContainerStarted","Data":"289d2ed9ffaecb6340bc90e1da788e63b568a1d66fd92beba471f362fa773484"} Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.302111 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" podUID="7e46cf81-12eb-4c37-9f04-affbd9f153b7" Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.317251 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" event={"ID":"d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e","Type":"ContainerStarted","Data":"e5ccd5b53f008b335e42960197f5812d63fd0b90ea5ab945ff313a7563bb7014"} Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.319232 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" podUID="d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e" Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.319835 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" event={"ID":"4ba7a21e-9aef-4596-9e18-21c66394cf74","Type":"ContainerStarted","Data":"70257ad1b9abb5262b85bce41044c394d0d5aa7c3adfe4f280ed6c7877befc00"} Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.321470 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" podUID="4ba7a21e-9aef-4596-9e18-21c66394cf74" Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.328154 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" event={"ID":"31f694c1-3948-4e87-90d1-5bd1e7d0aef6","Type":"ContainerStarted","Data":"76b96ef2f7a93548a98eb97b787703b4db30dccc71ed05c5c3da61e2fc077ea6"} Feb 24 03:10:12 crc kubenswrapper[4923]: I0224 03:10:12.334365 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" event={"ID":"039de08e-513c-47e3-a3f7-59b8911b7dae","Type":"ContainerStarted","Data":"d4fe634827727346c03794b897d5312d9184ce29249beeb85d974ad177cbcaad"} Feb 24 03:10:12 crc kubenswrapper[4923]: E0224 03:10:12.340438 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" podUID="039de08e-513c-47e3-a3f7-59b8911b7dae" Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.348815 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" podUID="a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e" Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.349121 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" podUID="400c9c7a-a90c-4b16-b13d-25c26be22f93" Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.349159 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" podUID="4ba7a21e-9aef-4596-9e18-21c66394cf74" Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.349193 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" podUID="7e46cf81-12eb-4c37-9f04-affbd9f153b7" Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.349229 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" podUID="d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e" Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.349471 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" podUID="039de08e-513c-47e3-a3f7-59b8911b7dae" Feb 24 03:10:13 crc kubenswrapper[4923]: I0224 03:10:13.381881 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.382077 4923 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.382150 4923 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert podName:26fc13d5-b98a-49ac-8c62-ee3ab08a9767 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:17.382119875 +0000 UTC m=+941.399190688 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert") pod "infra-operator-controller-manager-79d975b745-bmn7v" (UID: "26fc13d5-b98a-49ac-8c62-ee3ab08a9767") : secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:13 crc kubenswrapper[4923]: I0224 03:10:13.686898 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.687057 4923 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:13 crc kubenswrapper[4923]: E0224 03:10:13.687225 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert podName:8c5a7840-9e6b-4442-b99e-1ce50bff0722 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:17.687156121 +0000 UTC m=+941.704226934 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" (UID: "8c5a7840-9e6b-4442-b99e-1ce50bff0722") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:14 crc kubenswrapper[4923]: I0224 03:10:14.093324 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:14 crc kubenswrapper[4923]: I0224 03:10:14.093668 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:14 crc kubenswrapper[4923]: E0224 03:10:14.093526 4923 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 24 03:10:14 crc kubenswrapper[4923]: E0224 03:10:14.093752 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:18.093735509 +0000 UTC m=+942.110806312 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "webhook-server-cert" not found Feb 24 03:10:14 crc kubenswrapper[4923]: E0224 03:10:14.093843 4923 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 24 03:10:14 crc kubenswrapper[4923]: E0224 03:10:14.093955 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:18.093925234 +0000 UTC m=+942.110996107 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "metrics-server-cert" not found Feb 24 03:10:17 crc kubenswrapper[4923]: I0224 03:10:17.440005 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:17 crc kubenswrapper[4923]: E0224 03:10:17.440241 4923 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:17 crc kubenswrapper[4923]: E0224 03:10:17.440286 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert 
podName:26fc13d5-b98a-49ac-8c62-ee3ab08a9767 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:25.440274054 +0000 UTC m=+949.457344867 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert") pod "infra-operator-controller-manager-79d975b745-bmn7v" (UID: "26fc13d5-b98a-49ac-8c62-ee3ab08a9767") : secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:17 crc kubenswrapper[4923]: I0224 03:10:17.744396 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:17 crc kubenswrapper[4923]: E0224 03:10:17.744614 4923 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:17 crc kubenswrapper[4923]: E0224 03:10:17.744700 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert podName:8c5a7840-9e6b-4442-b99e-1ce50bff0722 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:25.744675843 +0000 UTC m=+949.761746676 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" (UID: "8c5a7840-9e6b-4442-b99e-1ce50bff0722") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:18 crc kubenswrapper[4923]: I0224 03:10:18.149915 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:18 crc kubenswrapper[4923]: I0224 03:10:18.150317 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:18 crc kubenswrapper[4923]: E0224 03:10:18.150105 4923 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 24 03:10:18 crc kubenswrapper[4923]: E0224 03:10:18.150560 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:26.150542381 +0000 UTC m=+950.167613194 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "webhook-server-cert" not found Feb 24 03:10:18 crc kubenswrapper[4923]: E0224 03:10:18.150372 4923 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 24 03:10:18 crc kubenswrapper[4923]: E0224 03:10:18.150661 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:26.150635714 +0000 UTC m=+950.167706567 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "metrics-server-cert" not found Feb 24 03:10:19 crc kubenswrapper[4923]: I0224 03:10:19.916025 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:10:19 crc kubenswrapper[4923]: I0224 03:10:19.916401 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:10:19 crc kubenswrapper[4923]: I0224 03:10:19.916448 4923 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 03:10:19 crc kubenswrapper[4923]: I0224 03:10:19.917285 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"369652d4e2fcdce7839d154f1d90c85b55a365ec3b7c320fea7e81e6fe472c3d"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 03:10:19 crc kubenswrapper[4923]: I0224 03:10:19.917392 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://369652d4e2fcdce7839d154f1d90c85b55a365ec3b7c320fea7e81e6fe472c3d" gracePeriod=600 Feb 24 03:10:20 crc kubenswrapper[4923]: I0224 03:10:20.394616 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="369652d4e2fcdce7839d154f1d90c85b55a365ec3b7c320fea7e81e6fe472c3d" exitCode=0 Feb 24 03:10:20 crc kubenswrapper[4923]: I0224 03:10:20.394660 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"369652d4e2fcdce7839d154f1d90c85b55a365ec3b7c320fea7e81e6fe472c3d"} Feb 24 03:10:20 crc kubenswrapper[4923]: I0224 03:10:20.394694 4923 scope.go:117] "RemoveContainer" containerID="dd9566916d0707d0d74e9c129e61b45fcf01bae77c86a9c663fcebb809b372a3" Feb 24 03:10:23 crc kubenswrapper[4923]: E0224 03:10:23.940449 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc" Feb 24 03:10:23 crc kubenswrapper[4923]: E0224 03:10:23.941118 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pn5gn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-589c568786-f4b5c_openstack-operators(4981f2a3-3977-40b7-819b-59cf400fa882): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:10:23 crc kubenswrapper[4923]: E0224 03:10:23.942346 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" podUID="4981f2a3-3977-40b7-819b-59cf400fa882" Feb 24 03:10:24 crc kubenswrapper[4923]: E0224 03:10:24.431631 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" podUID="4981f2a3-3977-40b7-819b-59cf400fa882" Feb 24 03:10:24 crc kubenswrapper[4923]: E0224 03:10:24.483879 4923 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 24 03:10:24 crc kubenswrapper[4923]: E0224 03:10:24.484150 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qc6qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-tqvjv_openstack-operators(6a78564f-37a8-4385-9f93-57ee3952d36c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:10:24 crc kubenswrapper[4923]: E0224 03:10:24.485357 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" podUID="6a78564f-37a8-4385-9f93-57ee3952d36c" Feb 24 03:10:25 crc kubenswrapper[4923]: E0224 03:10:25.259874 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 24 03:10:25 crc kubenswrapper[4923]: E0224 03:10:25.260431 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ln2jm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-c8n7r_openstack-operators(063cbd60-dc19-4c47-96ca-7b9cb24bf2ef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:10:25 crc kubenswrapper[4923]: E0224 03:10:25.262232 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" podUID="063cbd60-dc19-4c47-96ca-7b9cb24bf2ef" Feb 24 03:10:25 crc kubenswrapper[4923]: E0224 03:10:25.445331 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" podUID="063cbd60-dc19-4c47-96ca-7b9cb24bf2ef" Feb 24 03:10:25 crc kubenswrapper[4923]: E0224 03:10:25.445583 4923 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" podUID="6a78564f-37a8-4385-9f93-57ee3952d36c" Feb 24 03:10:25 crc kubenswrapper[4923]: I0224 03:10:25.513281 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:25 crc kubenswrapper[4923]: E0224 03:10:25.513552 4923 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:25 crc kubenswrapper[4923]: E0224 03:10:25.513634 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert podName:26fc13d5-b98a-49ac-8c62-ee3ab08a9767 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:41.513618166 +0000 UTC m=+965.530688979 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert") pod "infra-operator-controller-manager-79d975b745-bmn7v" (UID: "26fc13d5-b98a-49ac-8c62-ee3ab08a9767") : secret "infra-operator-webhook-server-cert" not found Feb 24 03:10:25 crc kubenswrapper[4923]: I0224 03:10:25.818530 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:25 crc kubenswrapper[4923]: E0224 03:10:25.818705 4923 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:25 crc kubenswrapper[4923]: E0224 03:10:25.818765 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert podName:8c5a7840-9e6b-4442-b99e-1ce50bff0722 nodeName:}" failed. No retries permitted until 2026-02-24 03:10:41.818747424 +0000 UTC m=+965.835818237 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" (UID: "8c5a7840-9e6b-4442-b99e-1ce50bff0722") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 24 03:10:26 crc kubenswrapper[4923]: I0224 03:10:26.232200 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:26 crc kubenswrapper[4923]: I0224 03:10:26.232256 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:26 crc kubenswrapper[4923]: E0224 03:10:26.232416 4923 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 24 03:10:26 crc kubenswrapper[4923]: E0224 03:10:26.232446 4923 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 24 03:10:26 crc kubenswrapper[4923]: E0224 03:10:26.232466 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:42.232451741 +0000 UTC m=+966.249522554 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "metrics-server-cert" not found Feb 24 03:10:26 crc kubenswrapper[4923]: E0224 03:10:26.232532 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs podName:6b2c8692-7382-4716-8770-47d21209898f nodeName:}" failed. No retries permitted until 2026-02-24 03:10:42.232515852 +0000 UTC m=+966.249586665 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs") pod "openstack-operator-controller-manager-5dd698895-qrccn" (UID: "6b2c8692-7382-4716-8770-47d21209898f") : secret "webhook-server-cert" not found Feb 24 03:10:27 crc kubenswrapper[4923]: I0224 03:10:27.460178 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" event={"ID":"13de02b1-8017-4d32-b848-08d241ef34d4","Type":"ContainerStarted","Data":"0ed7e8c02f8c50b0c6324234062fc8d3f038c95677345b989c2ac021bfc43464"} Feb 24 03:10:27 crc kubenswrapper[4923]: I0224 03:10:27.460541 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" Feb 24 03:10:27 crc kubenswrapper[4923]: I0224 03:10:27.464263 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" event={"ID":"74dbbb69-5b9b-45b1-a74e-8bed20a6cbed","Type":"ContainerStarted","Data":"c199718d0b07868a2d3f9a9331ef93d50e3ba156157ed67648a67ffe1caf4691"} Feb 24 03:10:27 crc kubenswrapper[4923]: I0224 03:10:27.464710 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" Feb 24 03:10:27 crc kubenswrapper[4923]: I0224 03:10:27.466148 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" event={"ID":"2dda58b4-8524-47cf-9e31-f276859d0af1","Type":"ContainerStarted","Data":"076ba12d7ac9e7f7565297455de7bcca4c559ec2a016c29bc0855d8a7e59dee6"} Feb 24 03:10:27 crc kubenswrapper[4923]: I0224 03:10:27.466410 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" Feb 24 03:10:27 crc kubenswrapper[4923]: I0224 03:10:27.477254 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" podStartSLOduration=4.119533809 podStartE2EDuration="18.477240647s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:10.86595838 +0000 UTC m=+934.883029193" lastFinishedPulling="2026-02-24 03:10:25.223665218 +0000 UTC m=+949.240736031" observedRunningTime="2026-02-24 03:10:27.477113183 +0000 UTC m=+951.494184026" watchObservedRunningTime="2026-02-24 03:10:27.477240647 +0000 UTC m=+951.494311460" Feb 24 03:10:27 crc kubenswrapper[4923]: I0224 03:10:27.495717 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" podStartSLOduration=4.383337326 podStartE2EDuration="18.495697884s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.110858658 +0000 UTC m=+935.127929461" lastFinishedPulling="2026-02-24 03:10:25.223219206 +0000 UTC m=+949.240290019" observedRunningTime="2026-02-24 03:10:27.490196779 +0000 UTC m=+951.507267592" watchObservedRunningTime="2026-02-24 03:10:27.495697884 +0000 UTC m=+951.512768697" Feb 24 03:10:27 crc 
kubenswrapper[4923]: I0224 03:10:27.533720 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" podStartSLOduration=4.715315415 podStartE2EDuration="18.533683978s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:10.636333277 +0000 UTC m=+934.653404080" lastFinishedPulling="2026-02-24 03:10:24.45470183 +0000 UTC m=+948.471772643" observedRunningTime="2026-02-24 03:10:27.530280318 +0000 UTC m=+951.547351131" watchObservedRunningTime="2026-02-24 03:10:27.533683978 +0000 UTC m=+951.550754801" Feb 24 03:10:28 crc kubenswrapper[4923]: I0224 03:10:28.489421 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"40cb3d82b93cff9bd3bf829c2417332644f1c7038c262573b0f2c1eba50e9cc2"} Feb 24 03:10:28 crc kubenswrapper[4923]: I0224 03:10:28.493475 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" event={"ID":"ae81debf-3361-424d-afbe-9e4521997d23","Type":"ContainerStarted","Data":"6780c783334625d28d8969f8c1b899ed4a06c3a2a97864f291195be977975824"} Feb 24 03:10:28 crc kubenswrapper[4923]: I0224 03:10:28.493786 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" Feb 24 03:10:28 crc kubenswrapper[4923]: I0224 03:10:28.502641 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" event={"ID":"8b9a0e9e-0ef9-4b69-87f3-63cfb4204996","Type":"ContainerStarted","Data":"14ecc9ef251184b4df3b7aabaafc62a8ec233f2d420fa71bfdc621e06f92f3f8"} Feb 24 03:10:28 crc kubenswrapper[4923]: I0224 03:10:28.502711 4923 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" Feb 24 03:10:28 crc kubenswrapper[4923]: I0224 03:10:28.506007 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" event={"ID":"85db5c1e-a62f-496a-a8ce-0e32d4321ac9","Type":"ContainerStarted","Data":"3b0c62eefd175e85fb785fda03ca6efc089fcbb16a9b9f3936f490a194597750"} Feb 24 03:10:28 crc kubenswrapper[4923]: I0224 03:10:28.506033 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" Feb 24 03:10:28 crc kubenswrapper[4923]: I0224 03:10:28.519453 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" podStartSLOduration=4.796395857 podStartE2EDuration="18.519438762s" podCreationTimestamp="2026-02-24 03:10:10 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.500486429 +0000 UTC m=+935.517557262" lastFinishedPulling="2026-02-24 03:10:25.223529334 +0000 UTC m=+949.240600167" observedRunningTime="2026-02-24 03:10:28.512883319 +0000 UTC m=+952.529954122" watchObservedRunningTime="2026-02-24 03:10:28.519438762 +0000 UTC m=+952.536509575" Feb 24 03:10:28 crc kubenswrapper[4923]: I0224 03:10:28.532693 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" podStartSLOduration=4.215548205 podStartE2EDuration="19.532676612s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.243891752 +0000 UTC m=+935.260962565" lastFinishedPulling="2026-02-24 03:10:26.561020159 +0000 UTC m=+950.578090972" observedRunningTime="2026-02-24 03:10:28.530536685 +0000 UTC m=+952.547607518" watchObservedRunningTime="2026-02-24 03:10:28.532676612 +0000 UTC m=+952.549747415" Feb 24 03:10:28 crc 
kubenswrapper[4923]: I0224 03:10:28.553979 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" podStartSLOduration=5.206878817 podStartE2EDuration="19.553963424s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:10.876128599 +0000 UTC m=+934.893199402" lastFinishedPulling="2026-02-24 03:10:25.223213196 +0000 UTC m=+949.240284009" observedRunningTime="2026-02-24 03:10:28.552860935 +0000 UTC m=+952.569931748" watchObservedRunningTime="2026-02-24 03:10:28.553963424 +0000 UTC m=+952.571034227" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.526841 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" event={"ID":"a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e","Type":"ContainerStarted","Data":"318c5ae3c0f49751883ac07e0c809c66c0eba72e520793339937ea8f8c05b5ef"} Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.527399 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.530646 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" event={"ID":"4ba7a21e-9aef-4596-9e18-21c66394cf74","Type":"ContainerStarted","Data":"73a559203a331ec5f2ec0e8bbe0573196273db47a190abbf4ad8c0a27f46114d"} Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.530906 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.532390 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" 
event={"ID":"31f694c1-3948-4e87-90d1-5bd1e7d0aef6","Type":"ContainerStarted","Data":"62be3a994c45e69523d147cf6d37d309471b01a622ee75847d99005c812d8b65"} Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.532471 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.543012 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" event={"ID":"7228fc47-38cb-4680-9104-d5657a853147","Type":"ContainerStarted","Data":"c1a1c9a9303fb825733b32a3cd9be0f29b678c7d78c967851bb9307bba9811cc"} Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.543149 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.547907 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" event={"ID":"d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e","Type":"ContainerStarted","Data":"6e578d99f3ec5c67a0faf2b6abd930b6a5dd9e8be543870f58674902baca7823"} Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.548630 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.552244 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" podStartSLOduration=4.685700301 podStartE2EDuration="20.552229468s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.378088986 +0000 UTC m=+935.395159799" lastFinishedPulling="2026-02-24 03:10:27.244618153 +0000 UTC m=+951.261688966" 
observedRunningTime="2026-02-24 03:10:29.547975716 +0000 UTC m=+953.565046529" watchObservedRunningTime="2026-02-24 03:10:29.552229468 +0000 UTC m=+953.569300281" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.552589 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" event={"ID":"1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1","Type":"ContainerStarted","Data":"b0b62ba4149c61327eb3433354b64e111dba4b18caab054ae066fe26e901d3c2"} Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.553186 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.559529 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" event={"ID":"202e32ae-6025-43c3-90ee-d5a6ec2f7752","Type":"ContainerStarted","Data":"0d0b3891a4931bd4f5955c8ef122e7a1a401d4ff0656e21f243bb5f63dce14a4"} Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.560200 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.568206 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" event={"ID":"039de08e-513c-47e3-a3f7-59b8911b7dae","Type":"ContainerStarted","Data":"89fa907ea16f7bbe26005d2ec86fd77154b5a95388f2f98ba567b5f35b069a3c"} Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.568750 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.571198 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" event={"ID":"07169ece-c03c-464a-9899-f03b61426df5","Type":"ContainerStarted","Data":"faa2a45445d7e81a42e2b11c1427d7a392128378896b83f5408ad345233e449d"} Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.571226 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.590977 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" podStartSLOduration=6.613212378 podStartE2EDuration="20.590958881s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.246331216 +0000 UTC m=+935.263402029" lastFinishedPulling="2026-02-24 03:10:25.224077719 +0000 UTC m=+949.241148532" observedRunningTime="2026-02-24 03:10:29.574054685 +0000 UTC m=+953.591125498" watchObservedRunningTime="2026-02-24 03:10:29.590958881 +0000 UTC m=+953.608029694" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.600407 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" podStartSLOduration=5.834420791 podStartE2EDuration="20.60038635s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.256483355 +0000 UTC m=+935.273554168" lastFinishedPulling="2026-02-24 03:10:26.022448914 +0000 UTC m=+950.039519727" observedRunningTime="2026-02-24 03:10:29.597407951 +0000 UTC m=+953.614478764" watchObservedRunningTime="2026-02-24 03:10:29.60038635 +0000 UTC m=+953.617457163" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.620265 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" podStartSLOduration=3.836810443 
podStartE2EDuration="19.620249035s" podCreationTimestamp="2026-02-24 03:10:10 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.513586365 +0000 UTC m=+935.530657178" lastFinishedPulling="2026-02-24 03:10:27.297024957 +0000 UTC m=+951.314095770" observedRunningTime="2026-02-24 03:10:29.615317624 +0000 UTC m=+953.632388437" watchObservedRunningTime="2026-02-24 03:10:29.620249035 +0000 UTC m=+953.637319848" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.658483 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" podStartSLOduration=4.724157488 podStartE2EDuration="20.658467474s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.368217656 +0000 UTC m=+935.385288469" lastFinishedPulling="2026-02-24 03:10:27.302527642 +0000 UTC m=+951.319598455" observedRunningTime="2026-02-24 03:10:29.655558917 +0000 UTC m=+953.672629730" watchObservedRunningTime="2026-02-24 03:10:29.658467474 +0000 UTC m=+953.675538287" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.680261 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" podStartSLOduration=3.805565896 podStartE2EDuration="20.680242479s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.381758593 +0000 UTC m=+935.398829406" lastFinishedPulling="2026-02-24 03:10:28.256435186 +0000 UTC m=+952.273505989" observedRunningTime="2026-02-24 03:10:29.677731473 +0000 UTC m=+953.694802286" watchObservedRunningTime="2026-02-24 03:10:29.680242479 +0000 UTC m=+953.697313282" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.737629 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" podStartSLOduration=5.776734297 
podStartE2EDuration="20.737609204s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.061871795 +0000 UTC m=+935.078942608" lastFinishedPulling="2026-02-24 03:10:26.022746702 +0000 UTC m=+950.039817515" observedRunningTime="2026-02-24 03:10:29.704914271 +0000 UTC m=+953.721985104" watchObservedRunningTime="2026-02-24 03:10:29.737609204 +0000 UTC m=+953.754680017" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.764437 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" podStartSLOduration=6.605490334 podStartE2EDuration="20.764422892s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.065266694 +0000 UTC m=+935.082337507" lastFinishedPulling="2026-02-24 03:10:25.224199252 +0000 UTC m=+949.241270065" observedRunningTime="2026-02-24 03:10:29.744143587 +0000 UTC m=+953.761214400" watchObservedRunningTime="2026-02-24 03:10:29.764422892 +0000 UTC m=+953.781493705" Feb 24 03:10:29 crc kubenswrapper[4923]: I0224 03:10:29.766154 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" podStartSLOduration=4.626872928 podStartE2EDuration="20.766146858s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.061921366 +0000 UTC m=+935.078992179" lastFinishedPulling="2026-02-24 03:10:27.201195296 +0000 UTC m=+951.218266109" observedRunningTime="2026-02-24 03:10:29.762116032 +0000 UTC m=+953.779186845" watchObservedRunningTime="2026-02-24 03:10:29.766146858 +0000 UTC m=+953.783217671" Feb 24 03:10:33 crc kubenswrapper[4923]: I0224 03:10:33.612760 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" 
event={"ID":"400c9c7a-a90c-4b16-b13d-25c26be22f93","Type":"ContainerStarted","Data":"d3a2bbc54a0c37c84a68924bb4347902f6565a02cde505866bef7ce748704927"} Feb 24 03:10:33 crc kubenswrapper[4923]: I0224 03:10:33.613494 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" Feb 24 03:10:33 crc kubenswrapper[4923]: I0224 03:10:33.615502 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" event={"ID":"7e46cf81-12eb-4c37-9f04-affbd9f153b7","Type":"ContainerStarted","Data":"af124ac85abb9939cd30772a84681f9369d292f98657faaa654bb4ae504f0c4d"} Feb 24 03:10:33 crc kubenswrapper[4923]: I0224 03:10:33.645580 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vczgc" podStartSLOduration=2.1374862119999998 podStartE2EDuration="23.645558217s" podCreationTimestamp="2026-02-24 03:10:10 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.518185136 +0000 UTC m=+935.535255949" lastFinishedPulling="2026-02-24 03:10:33.026257141 +0000 UTC m=+957.043327954" observedRunningTime="2026-02-24 03:10:33.643366409 +0000 UTC m=+957.660437222" watchObservedRunningTime="2026-02-24 03:10:33.645558217 +0000 UTC m=+957.662629040" Feb 24 03:10:33 crc kubenswrapper[4923]: I0224 03:10:33.646021 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" podStartSLOduration=3.026099441 podStartE2EDuration="24.646014959s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.378905748 +0000 UTC m=+935.395976561" lastFinishedPulling="2026-02-24 03:10:32.998821266 +0000 UTC m=+957.015892079" observedRunningTime="2026-02-24 03:10:33.630633643 +0000 UTC m=+957.647704456" watchObservedRunningTime="2026-02-24 
03:10:33.646014959 +0000 UTC m=+957.663085782" Feb 24 03:10:36 crc kubenswrapper[4923]: I0224 03:10:36.641278 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" event={"ID":"4981f2a3-3977-40b7-819b-59cf400fa882","Type":"ContainerStarted","Data":"a160e4933d69666f714dc005833a31ec8ee82bb1b151bc2ab015ac3f3e31b3b2"} Feb 24 03:10:36 crc kubenswrapper[4923]: I0224 03:10:36.642428 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" Feb 24 03:10:36 crc kubenswrapper[4923]: I0224 03:10:36.668896 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" podStartSLOduration=2.787310114 podStartE2EDuration="27.668876015s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.345560587 +0000 UTC m=+935.362631390" lastFinishedPulling="2026-02-24 03:10:36.227126478 +0000 UTC m=+960.244197291" observedRunningTime="2026-02-24 03:10:36.664850418 +0000 UTC m=+960.681921241" watchObservedRunningTime="2026-02-24 03:10:36.668876015 +0000 UTC m=+960.685946838" Feb 24 03:10:39 crc kubenswrapper[4923]: I0224 03:10:39.823585 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-msrg6" Feb 24 03:10:39 crc kubenswrapper[4923]: I0224 03:10:39.835850 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-lnz72" Feb 24 03:10:39 crc kubenswrapper[4923]: I0224 03:10:39.859524 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-n2qf9" Feb 24 03:10:39 crc kubenswrapper[4923]: I0224 03:10:39.878451 4923 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-58z87" Feb 24 03:10:39 crc kubenswrapper[4923]: I0224 03:10:39.914968 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-tbhxh" Feb 24 03:10:39 crc kubenswrapper[4923]: I0224 03:10:39.953741 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2mdh9" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.124487 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-pd2zn" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.193164 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4d7n7" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.222028 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-4c6hc" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.280936 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-r64q5" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.303932 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-2zxl4" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.385584 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-m8tjf" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.508500 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-zxhc7" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.532985 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-x6vmb" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.586671 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-5pzgk" Feb 24 03:10:40 crc kubenswrapper[4923]: I0224 03:10:40.614950 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-2bgtt" Feb 24 03:10:41 crc kubenswrapper[4923]: I0224 03:10:41.575426 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:41 crc kubenswrapper[4923]: I0224 03:10:41.583938 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26fc13d5-b98a-49ac-8c62-ee3ab08a9767-cert\") pod \"infra-operator-controller-manager-79d975b745-bmn7v\" (UID: \"26fc13d5-b98a-49ac-8c62-ee3ab08a9767\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:41 crc kubenswrapper[4923]: I0224 03:10:41.833053 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-b9sx9" Feb 24 03:10:41 crc kubenswrapper[4923]: I0224 03:10:41.840579 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:41 crc kubenswrapper[4923]: I0224 03:10:41.880503 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:41 crc kubenswrapper[4923]: I0224 03:10:41.886995 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c5a7840-9e6b-4442-b99e-1ce50bff0722-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw\" (UID: \"8c5a7840-9e6b-4442-b99e-1ce50bff0722\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.083289 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-djsbj" Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.092406 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.139268 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v"] Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.291058 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.291102 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.295490 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-metrics-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.295550 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6b2c8692-7382-4716-8770-47d21209898f-webhook-certs\") pod \"openstack-operator-controller-manager-5dd698895-qrccn\" (UID: \"6b2c8692-7382-4716-8770-47d21209898f\") " 
pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.522006 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wr8bb" Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.531503 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.569941 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw"] Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.698789 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" event={"ID":"26fc13d5-b98a-49ac-8c62-ee3ab08a9767","Type":"ContainerStarted","Data":"c95add83d3babaedf804d0b501a48d106f37c801e929320cb2b1e01b6ade5baf"} Feb 24 03:10:42 crc kubenswrapper[4923]: I0224 03:10:42.702633 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" event={"ID":"8c5a7840-9e6b-4442-b99e-1ce50bff0722","Type":"ContainerStarted","Data":"ae66660895b439a61d7ba7c4d3104f85a6b91f8ed69650ee4dcf4a3b772cabeb"} Feb 24 03:10:43 crc kubenswrapper[4923]: I0224 03:10:43.001324 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn"] Feb 24 03:10:43 crc kubenswrapper[4923]: W0224 03:10:43.023520 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b2c8692_7382_4716_8770_47d21209898f.slice/crio-549815aa209690043f9c21fa0676b04254181bd446e019e34c2790c0069904d6 WatchSource:0}: Error finding container 
549815aa209690043f9c21fa0676b04254181bd446e019e34c2790c0069904d6: Status 404 returned error can't find the container with id 549815aa209690043f9c21fa0676b04254181bd446e019e34c2790c0069904d6 Feb 24 03:10:43 crc kubenswrapper[4923]: I0224 03:10:43.710867 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" event={"ID":"6b2c8692-7382-4716-8770-47d21209898f","Type":"ContainerStarted","Data":"549815aa209690043f9c21fa0676b04254181bd446e019e34c2790c0069904d6"} Feb 24 03:10:48 crc kubenswrapper[4923]: I0224 03:10:48.743734 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" event={"ID":"6b2c8692-7382-4716-8770-47d21209898f","Type":"ContainerStarted","Data":"7655023ca864b06e989a46254b6fb3e803eb23a041a3593c6daaf1a90ecced73"} Feb 24 03:10:49 crc kubenswrapper[4923]: I0224 03:10:49.749002 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:10:49 crc kubenswrapper[4923]: I0224 03:10:49.786465 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" podStartSLOduration=39.786442764 podStartE2EDuration="39.786442764s" podCreationTimestamp="2026-02-24 03:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:10:49.777636673 +0000 UTC m=+973.794707506" watchObservedRunningTime="2026-02-24 03:10:49.786442764 +0000 UTC m=+973.803513617" Feb 24 03:10:50 crc kubenswrapper[4923]: I0224 03:10:50.428216 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-f4b5c" Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.763213 4923 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" event={"ID":"6a78564f-37a8-4385-9f93-57ee3952d36c","Type":"ContainerStarted","Data":"a2e43285e57cc96c918370b7ec811da9f2e20fb3b6ebf3e52700fc3aed392596"} Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.763725 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.764532 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" event={"ID":"063cbd60-dc19-4c47-96ca-7b9cb24bf2ef","Type":"ContainerStarted","Data":"8597c2db0d6ebd74b29353957ae52c12dba0baf67cc2f9641c2efd99c765ba87"} Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.764703 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.766041 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" event={"ID":"8c5a7840-9e6b-4442-b99e-1ce50bff0722","Type":"ContainerStarted","Data":"9ec20da14b8e173dd0064eb76a5ff5f8dcfa8ea8bf228469df6a284558116758"} Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.766097 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.767584 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" event={"ID":"26fc13d5-b98a-49ac-8c62-ee3ab08a9767","Type":"ContainerStarted","Data":"124a3655bfcf066fdcd51b21c1bbb4145510bca7d251af85afbdfe2269991d7f"} Feb 24 03:10:51 crc 
kubenswrapper[4923]: I0224 03:10:51.767735 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.782660 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" podStartSLOduration=3.131427604 podStartE2EDuration="42.782642085s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.106942785 +0000 UTC m=+935.124013598" lastFinishedPulling="2026-02-24 03:10:50.758157266 +0000 UTC m=+974.775228079" observedRunningTime="2026-02-24 03:10:51.779855081 +0000 UTC m=+975.796925914" watchObservedRunningTime="2026-02-24 03:10:51.782642085 +0000 UTC m=+975.799712908" Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.800327 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" podStartSLOduration=3.289891214 podStartE2EDuration="42.800289407s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:11.248065342 +0000 UTC m=+935.265136155" lastFinishedPulling="2026-02-24 03:10:50.758463535 +0000 UTC m=+974.775534348" observedRunningTime="2026-02-24 03:10:51.795213444 +0000 UTC m=+975.812284257" watchObservedRunningTime="2026-02-24 03:10:51.800289407 +0000 UTC m=+975.817360231" Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.820149 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" podStartSLOduration=34.147120747 podStartE2EDuration="42.820132368s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:42.159433595 +0000 UTC m=+966.176504408" lastFinishedPulling="2026-02-24 03:10:50.832445206 +0000 UTC m=+974.849516029" 
observedRunningTime="2026-02-24 03:10:51.816812051 +0000 UTC m=+975.833882874" watchObservedRunningTime="2026-02-24 03:10:51.820132368 +0000 UTC m=+975.837203221" Feb 24 03:10:51 crc kubenswrapper[4923]: I0224 03:10:51.848373 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" podStartSLOduration=34.618717344 podStartE2EDuration="42.848358419s" podCreationTimestamp="2026-02-24 03:10:09 +0000 UTC" firstStartedPulling="2026-02-24 03:10:42.57955125 +0000 UTC m=+966.596622083" lastFinishedPulling="2026-02-24 03:10:50.809192345 +0000 UTC m=+974.826263158" observedRunningTime="2026-02-24 03:10:51.843104071 +0000 UTC m=+975.860174894" watchObservedRunningTime="2026-02-24 03:10:51.848358419 +0000 UTC m=+975.865429242" Feb 24 03:11:00 crc kubenswrapper[4923]: I0224 03:11:00.109356 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-tqvjv" Feb 24 03:11:00 crc kubenswrapper[4923]: I0224 03:11:00.249615 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-c8n7r" Feb 24 03:11:01 crc kubenswrapper[4923]: I0224 03:11:01.850060 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-bmn7v" Feb 24 03:11:02 crc kubenswrapper[4923]: I0224 03:11:02.097789 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw" Feb 24 03:11:02 crc kubenswrapper[4923]: I0224 03:11:02.537927 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5dd698895-qrccn" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.198209 4923 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jfgpx"] Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.200368 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.203190 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.203276 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.203548 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.208986 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jfgpx"] Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.210410 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bwfk8" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.263781 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-69rxm"] Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.266875 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.272835 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.276126 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-69rxm"] Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.371603 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-69rxm\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.371651 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ljt8\" (UniqueName: \"kubernetes.io/projected/5cb1d966-13e1-4ab1-8707-b06cf4f62036-kube-api-access-6ljt8\") pod \"dnsmasq-dns-78dd6ddcc-69rxm\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.371786 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmc2\" (UniqueName: \"kubernetes.io/projected/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-kube-api-access-4vmc2\") pod \"dnsmasq-dns-675f4bcbfc-jfgpx\" (UID: \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.371854 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-config\") pod \"dnsmasq-dns-675f4bcbfc-jfgpx\" (UID: \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.371874 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-config\") pod \"dnsmasq-dns-78dd6ddcc-69rxm\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.472907 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-69rxm\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.472950 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ljt8\" (UniqueName: \"kubernetes.io/projected/5cb1d966-13e1-4ab1-8707-b06cf4f62036-kube-api-access-6ljt8\") pod \"dnsmasq-dns-78dd6ddcc-69rxm\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.472999 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmc2\" (UniqueName: \"kubernetes.io/projected/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-kube-api-access-4vmc2\") pod \"dnsmasq-dns-675f4bcbfc-jfgpx\" (UID: \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.473023 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-config\") pod \"dnsmasq-dns-675f4bcbfc-jfgpx\" (UID: \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 
03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.473040 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-config\") pod \"dnsmasq-dns-78dd6ddcc-69rxm\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.473870 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-config\") pod \"dnsmasq-dns-78dd6ddcc-69rxm\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.474190 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-config\") pod \"dnsmasq-dns-675f4bcbfc-jfgpx\" (UID: \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.474237 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-69rxm\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.495253 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmc2\" (UniqueName: \"kubernetes.io/projected/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-kube-api-access-4vmc2\") pod \"dnsmasq-dns-675f4bcbfc-jfgpx\" (UID: \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.496441 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6ljt8\" (UniqueName: \"kubernetes.io/projected/5cb1d966-13e1-4ab1-8707-b06cf4f62036-kube-api-access-6ljt8\") pod \"dnsmasq-dns-78dd6ddcc-69rxm\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.522387 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.581066 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.965459 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jfgpx"] Feb 24 03:11:26 crc kubenswrapper[4923]: I0224 03:11:26.972080 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 03:11:27 crc kubenswrapper[4923]: I0224 03:11:27.065515 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" event={"ID":"de25eb29-6bef-4d9c-9078-35fb4dc5dc69","Type":"ContainerStarted","Data":"f3e6d7c0fcfa0f400cf6e52671f1c3caffe2f933761f3c9554e1b7b086a293b0"} Feb 24 03:11:27 crc kubenswrapper[4923]: W0224 03:11:27.066999 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cb1d966_13e1_4ab1_8707_b06cf4f62036.slice/crio-48cc35ad5877f3fa0190ad1e038b8efdcf84c39995d49fc9d03340fec5145494 WatchSource:0}: Error finding container 48cc35ad5877f3fa0190ad1e038b8efdcf84c39995d49fc9d03340fec5145494: Status 404 returned error can't find the container with id 48cc35ad5877f3fa0190ad1e038b8efdcf84c39995d49fc9d03340fec5145494 Feb 24 03:11:27 crc kubenswrapper[4923]: I0224 03:11:27.067414 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-69rxm"] Feb 24 03:11:28 crc 
kubenswrapper[4923]: I0224 03:11:28.078456 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" event={"ID":"5cb1d966-13e1-4ab1-8707-b06cf4f62036","Type":"ContainerStarted","Data":"48cc35ad5877f3fa0190ad1e038b8efdcf84c39995d49fc9d03340fec5145494"} Feb 24 03:11:28 crc kubenswrapper[4923]: I0224 03:11:28.753011 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jfgpx"] Feb 24 03:11:28 crc kubenswrapper[4923]: I0224 03:11:28.763734 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rhvwd"] Feb 24 03:11:28 crc kubenswrapper[4923]: I0224 03:11:28.765487 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:28 crc kubenswrapper[4923]: I0224 03:11:28.774972 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rhvwd"] Feb 24 03:11:28 crc kubenswrapper[4923]: I0224 03:11:28.941054 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-config\") pod \"dnsmasq-dns-666b6646f7-rhvwd\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:28 crc kubenswrapper[4923]: I0224 03:11:28.941127 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rhvwd\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:28 crc kubenswrapper[4923]: I0224 03:11:28.941172 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkw8d\" (UniqueName: 
\"kubernetes.io/projected/a4795fab-2fa1-4b70-b354-adf47ecf0575-kube-api-access-tkw8d\") pod \"dnsmasq-dns-666b6646f7-rhvwd\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.038873 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-69rxm"] Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.042006 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-config\") pod \"dnsmasq-dns-666b6646f7-rhvwd\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.042080 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rhvwd\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.042126 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkw8d\" (UniqueName: \"kubernetes.io/projected/a4795fab-2fa1-4b70-b354-adf47ecf0575-kube-api-access-tkw8d\") pod \"dnsmasq-dns-666b6646f7-rhvwd\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.043457 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-config\") pod \"dnsmasq-dns-666b6646f7-rhvwd\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.043828 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-dns-svc\") pod \"dnsmasq-dns-666b6646f7-rhvwd\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.060038 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gjls5"] Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.061103 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.077231 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkw8d\" (UniqueName: \"kubernetes.io/projected/a4795fab-2fa1-4b70-b354-adf47ecf0575-kube-api-access-tkw8d\") pod \"dnsmasq-dns-666b6646f7-rhvwd\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.087965 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gjls5"] Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.095767 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.245505 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-config\") pod \"dnsmasq-dns-57d769cc4f-gjls5\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.245902 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gjls5\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.245928 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djlss\" (UniqueName: \"kubernetes.io/projected/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-kube-api-access-djlss\") pod \"dnsmasq-dns-57d769cc4f-gjls5\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.352168 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gjls5\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.352211 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djlss\" (UniqueName: \"kubernetes.io/projected/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-kube-api-access-djlss\") pod \"dnsmasq-dns-57d769cc4f-gjls5\" (UID: 
\"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.352605 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-config\") pod \"dnsmasq-dns-57d769cc4f-gjls5\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.353346 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gjls5\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.353881 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-config\") pod \"dnsmasq-dns-57d769cc4f-gjls5\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.371908 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djlss\" (UniqueName: \"kubernetes.io/projected/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-kube-api-access-djlss\") pod \"dnsmasq-dns-57d769cc4f-gjls5\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.421878 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.578458 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rhvwd"] Feb 24 03:11:29 crc kubenswrapper[4923]: W0224 03:11:29.583646 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4795fab_2fa1_4b70_b354_adf47ecf0575.slice/crio-73b2fd4f9c75eb5a55ee14db33a4098344e9d3f86f120bb6bac64789363eca06 WatchSource:0}: Error finding container 73b2fd4f9c75eb5a55ee14db33a4098344e9d3f86f120bb6bac64789363eca06: Status 404 returned error can't find the container with id 73b2fd4f9c75eb5a55ee14db33a4098344e9d3f86f120bb6bac64789363eca06 Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.781230 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gjls5"] Feb 24 03:11:29 crc kubenswrapper[4923]: W0224 03:11:29.792949 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fb6a340_692e_43b8_aa57_c7bb67f0ba7c.slice/crio-ec6673eba3cd6ef33121c0d19b5d55790da08557bdbf61f6d13b251fe1d6356f WatchSource:0}: Error finding container ec6673eba3cd6ef33121c0d19b5d55790da08557bdbf61f6d13b251fe1d6356f: Status 404 returned error can't find the container with id ec6673eba3cd6ef33121c0d19b5d55790da08557bdbf61f6d13b251fe1d6356f Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.940698 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.942551 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.949845 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.950041 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.950731 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jmcj4" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.951518 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.952043 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.952241 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.952484 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 24 03:11:29 crc kubenswrapper[4923]: I0224 03:11:29.952704 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076239 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076276 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076438 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076534 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076557 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076590 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076618 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076635 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076839 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076901 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.076923 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mfkt\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-kube-api-access-8mfkt\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.093884 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" 
event={"ID":"a4795fab-2fa1-4b70-b354-adf47ecf0575","Type":"ContainerStarted","Data":"73b2fd4f9c75eb5a55ee14db33a4098344e9d3f86f120bb6bac64789363eca06"} Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.097754 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" event={"ID":"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c","Type":"ContainerStarted","Data":"ec6673eba3cd6ef33121c0d19b5d55790da08557bdbf61f6d13b251fe1d6356f"} Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179355 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179406 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179428 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mfkt\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-kube-api-access-8mfkt\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179466 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179483 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179504 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179534 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179548 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179580 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179606 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179624 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.179799 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.198117 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.198788 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.201535 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") 
" pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.203608 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.206215 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mfkt\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-kube-api-access-8mfkt\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.211508 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-config-data\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.213412 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.215542 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.215569 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.217715 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.218067 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.218328 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5hz2x" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.218486 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.218594 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.218686 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.218811 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.232681 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.233076 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.233489 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.248140 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.271002 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.286238 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.382990 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383029 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383064 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383100 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fg5q\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-kube-api-access-4fg5q\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383130 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383150 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383164 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383185 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383218 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383237 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.383262 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484182 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484555 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484596 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484632 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fg5q\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-kube-api-access-4fg5q\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc 
kubenswrapper[4923]: I0224 03:11:30.484663 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484684 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484701 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484718 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484743 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484768 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.484793 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.485246 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.485710 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.485906 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.489471 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.490250 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.490376 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.491034 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.496382 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.502017 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.503139 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.508868 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fg5q\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-kube-api-access-4fg5q\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.519767 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.617123 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:11:30 crc kubenswrapper[4923]: I0224 03:11:30.781856 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 24 03:11:30 crc kubenswrapper[4923]: W0224 03:11:30.790827 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9950d0b_d980_4e4f_82b4_9f616c6c50a3.slice/crio-01fafa67e410af4f22a25e272dd86fd6d5177dcd60850a395c39f049657b860c WatchSource:0}: Error finding container 01fafa67e410af4f22a25e272dd86fd6d5177dcd60850a395c39f049657b860c: Status 404 returned error can't find the container with id 01fafa67e410af4f22a25e272dd86fd6d5177dcd60850a395c39f049657b860c Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.110816 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d9950d0b-d980-4e4f-82b4-9f616c6c50a3","Type":"ContainerStarted","Data":"01fafa67e410af4f22a25e272dd86fd6d5177dcd60850a395c39f049657b860c"} Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.226200 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.323830 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.340856 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.340951 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.350882 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.351221 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.355388 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bm4c6" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.356088 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.357568 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.397867 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ebe37b-5831-4545-8f6a-8db6e194982f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.397923 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wj4n\" (UniqueName: \"kubernetes.io/projected/71ebe37b-5831-4545-8f6a-8db6e194982f-kube-api-access-2wj4n\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.397980 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71ebe37b-5831-4545-8f6a-8db6e194982f-kolla-config\") pod \"openstack-galera-0\" 
(UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.398023 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ebe37b-5831-4545-8f6a-8db6e194982f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.398044 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71ebe37b-5831-4545-8f6a-8db6e194982f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.398079 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.398102 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71ebe37b-5831-4545-8f6a-8db6e194982f-config-data-default\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.398187 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ebe37b-5831-4545-8f6a-8db6e194982f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " 
pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.499764 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.499818 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71ebe37b-5831-4545-8f6a-8db6e194982f-config-data-default\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.499847 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ebe37b-5831-4545-8f6a-8db6e194982f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.499904 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ebe37b-5831-4545-8f6a-8db6e194982f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.499933 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wj4n\" (UniqueName: \"kubernetes.io/projected/71ebe37b-5831-4545-8f6a-8db6e194982f-kube-api-access-2wj4n\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.500003 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71ebe37b-5831-4545-8f6a-8db6e194982f-kolla-config\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.500038 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ebe37b-5831-4545-8f6a-8db6e194982f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.500153 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71ebe37b-5831-4545-8f6a-8db6e194982f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.500773 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71ebe37b-5831-4545-8f6a-8db6e194982f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.501175 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.509965 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/71ebe37b-5831-4545-8f6a-8db6e194982f-config-data-default\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.516635 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71ebe37b-5831-4545-8f6a-8db6e194982f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.527612 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71ebe37b-5831-4545-8f6a-8db6e194982f-kolla-config\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.528713 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71ebe37b-5831-4545-8f6a-8db6e194982f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.533762 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ebe37b-5831-4545-8f6a-8db6e194982f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.537394 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wj4n\" (UniqueName: \"kubernetes.io/projected/71ebe37b-5831-4545-8f6a-8db6e194982f-kube-api-access-2wj4n\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " 
pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.540737 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"71ebe37b-5831-4545-8f6a-8db6e194982f\") " pod="openstack/openstack-galera-0" Feb 24 03:11:31 crc kubenswrapper[4923]: I0224 03:11:31.670823 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.705847 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.707327 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.709284 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.709463 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-sth9j" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.709763 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.709878 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.711584 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.826997 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.827053 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.827210 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.827252 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.827347 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.827797 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.827875 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.827920 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxw2g\" (UniqueName: \"kubernetes.io/projected/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-kube-api-access-sxw2g\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932128 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932189 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxw2g\" (UniqueName: \"kubernetes.io/projected/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-kube-api-access-sxw2g\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932233 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932260 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932467 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932488 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932515 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932548 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932626 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.932980 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.933421 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.934138 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.936848 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " 
pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.938265 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.943726 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.953253 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxw2g\" (UniqueName: \"kubernetes.io/projected/b2879b26-9173-4d23-b6f4-9c9e4c43f08e-kube-api-access-sxw2g\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.976540 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 24 03:11:32 crc kubenswrapper[4923]: I0224 03:11:32.977529 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.001883 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.002457 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.002567 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nzxsb" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.009562 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b2879b26-9173-4d23-b6f4-9c9e4c43f08e\") " pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.012519 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.034190 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.035002 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tk2v\" (UniqueName: \"kubernetes.io/projected/2c9c3801-205a-40fd-929f-587f5aaa9ca2-kube-api-access-4tk2v\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.035140 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c9c3801-205a-40fd-929f-587f5aaa9ca2-config-data\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.035200 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9c3801-205a-40fd-929f-587f5aaa9ca2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.035266 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2c9c3801-205a-40fd-929f-587f5aaa9ca2-kolla-config\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.035433 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9c3801-205a-40fd-929f-587f5aaa9ca2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 
03:11:33.136860 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2c9c3801-205a-40fd-929f-587f5aaa9ca2-kolla-config\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.137014 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9c3801-205a-40fd-929f-587f5aaa9ca2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.137051 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tk2v\" (UniqueName: \"kubernetes.io/projected/2c9c3801-205a-40fd-929f-587f5aaa9ca2-kube-api-access-4tk2v\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.137072 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c9c3801-205a-40fd-929f-587f5aaa9ca2-config-data\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.137107 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9c3801-205a-40fd-929f-587f5aaa9ca2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.137913 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2c9c3801-205a-40fd-929f-587f5aaa9ca2-kolla-config\") pod 
\"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.138063 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c9c3801-205a-40fd-929f-587f5aaa9ca2-config-data\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.147524 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c9c3801-205a-40fd-929f-587f5aaa9ca2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.148787 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c9c3801-205a-40fd-929f-587f5aaa9ca2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.158831 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tk2v\" (UniqueName: \"kubernetes.io/projected/2c9c3801-205a-40fd-929f-587f5aaa9ca2-kube-api-access-4tk2v\") pod \"memcached-0\" (UID: \"2c9c3801-205a-40fd-929f-587f5aaa9ca2\") " pod="openstack/memcached-0" Feb 24 03:11:33 crc kubenswrapper[4923]: I0224 03:11:33.343174 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 24 03:11:35 crc kubenswrapper[4923]: I0224 03:11:35.359641 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 24 03:11:35 crc kubenswrapper[4923]: I0224 03:11:35.360513 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 24 03:11:35 crc kubenswrapper[4923]: I0224 03:11:35.362730 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-khzpq" Feb 24 03:11:35 crc kubenswrapper[4923]: I0224 03:11:35.378642 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 24 03:11:35 crc kubenswrapper[4923]: I0224 03:11:35.472336 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f2lz\" (UniqueName: \"kubernetes.io/projected/a6bbccab-2bc6-47d9-b1d1-896c661112d8-kube-api-access-5f2lz\") pod \"kube-state-metrics-0\" (UID: \"a6bbccab-2bc6-47d9-b1d1-896c661112d8\") " pod="openstack/kube-state-metrics-0" Feb 24 03:11:35 crc kubenswrapper[4923]: I0224 03:11:35.574402 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f2lz\" (UniqueName: \"kubernetes.io/projected/a6bbccab-2bc6-47d9-b1d1-896c661112d8-kube-api-access-5f2lz\") pod \"kube-state-metrics-0\" (UID: \"a6bbccab-2bc6-47d9-b1d1-896c661112d8\") " pod="openstack/kube-state-metrics-0" Feb 24 03:11:35 crc kubenswrapper[4923]: W0224 03:11:35.588421 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a3b0fbb_1a43_4a8b_9c15_dff4832fbba0.slice/crio-bacef2b23fac862774c61524ba1bb06b4e1a584c35d7b2ef35bd60d0fed2d8b3 WatchSource:0}: Error finding container bacef2b23fac862774c61524ba1bb06b4e1a584c35d7b2ef35bd60d0fed2d8b3: Status 404 returned error can't find the container with id bacef2b23fac862774c61524ba1bb06b4e1a584c35d7b2ef35bd60d0fed2d8b3 Feb 24 03:11:35 crc kubenswrapper[4923]: I0224 03:11:35.609277 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f2lz\" (UniqueName: \"kubernetes.io/projected/a6bbccab-2bc6-47d9-b1d1-896c661112d8-kube-api-access-5f2lz\") pod 
\"kube-state-metrics-0\" (UID: \"a6bbccab-2bc6-47d9-b1d1-896c661112d8\") " pod="openstack/kube-state-metrics-0" Feb 24 03:11:35 crc kubenswrapper[4923]: I0224 03:11:35.739857 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 24 03:11:36 crc kubenswrapper[4923]: I0224 03:11:36.154180 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0","Type":"ContainerStarted","Data":"bacef2b23fac862774c61524ba1bb06b4e1a584c35d7b2ef35bd60d0fed2d8b3"} Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.296093 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6l624"] Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.297538 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.300237 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-gh6b5" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.300287 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.300425 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.310210 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-555wh"] Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.311566 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.329211 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-555wh"] Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.357644 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6l624"] Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424373 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-combined-ca-bundle\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424419 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxgh4\" (UniqueName: \"kubernetes.io/projected/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-kube-api-access-gxgh4\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424458 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-var-run\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424493 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-var-lib\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: 
I0224 03:11:38.424511 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-ovn-controller-tls-certs\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424528 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-var-log\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424570 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-scripts\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424588 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-var-run\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424604 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-var-run-ovn\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424622 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fwcd\" (UniqueName: \"kubernetes.io/projected/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-kube-api-access-5fwcd\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424651 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-var-log-ovn\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424675 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-etc-ovs\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.424691 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-scripts\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526425 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-scripts\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526506 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-var-run\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526551 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-var-run-ovn\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526580 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fwcd\" (UniqueName: \"kubernetes.io/projected/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-kube-api-access-5fwcd\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526652 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-var-log-ovn\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526711 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-etc-ovs\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526732 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-scripts\") pod \"ovn-controller-ovs-555wh\" (UID: 
\"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526757 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-combined-ca-bundle\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526772 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxgh4\" (UniqueName: \"kubernetes.io/projected/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-kube-api-access-gxgh4\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526786 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-var-run\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526814 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-var-lib\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.526831 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-ovn-controller-tls-certs\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc 
kubenswrapper[4923]: I0224 03:11:38.526851 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-var-log\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.529771 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-scripts\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.530623 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-var-run\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.530725 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-var-log\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.531098 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-var-run\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.531165 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-var-run-ovn\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.531188 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-var-log-ovn\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.531213 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-etc-ovs\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.531262 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-var-lib\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.531977 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-scripts\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.533513 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-ovn-controller-tls-certs\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " 
pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.538050 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-combined-ca-bundle\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.547065 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fwcd\" (UniqueName: \"kubernetes.io/projected/22ccde27-2e54-4d62-8cc4-8b12ea5e92a7-kube-api-access-5fwcd\") pod \"ovn-controller-ovs-555wh\" (UID: \"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7\") " pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.547928 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxgh4\" (UniqueName: \"kubernetes.io/projected/6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095-kube-api-access-gxgh4\") pod \"ovn-controller-6l624\" (UID: \"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095\") " pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.617780 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6l624" Feb 24 03:11:38 crc kubenswrapper[4923]: I0224 03:11:38.627913 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.156203 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.157652 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.159981 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.160183 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-hhhrj" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.160347 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.160518 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.161638 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.182887 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.278329 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.278410 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c976efe6-239a-4f24-a392-b1b5ba3545de-config\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.278501 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/c976efe6-239a-4f24-a392-b1b5ba3545de-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.278534 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c976efe6-239a-4f24-a392-b1b5ba3545de-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.278607 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c976efe6-239a-4f24-a392-b1b5ba3545de-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.278663 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c976efe6-239a-4f24-a392-b1b5ba3545de-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.278748 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wvfv\" (UniqueName: \"kubernetes.io/projected/c976efe6-239a-4f24-a392-b1b5ba3545de-kube-api-access-7wvfv\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.278765 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c976efe6-239a-4f24-a392-b1b5ba3545de-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.380341 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c976efe6-239a-4f24-a392-b1b5ba3545de-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.380410 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wvfv\" (UniqueName: \"kubernetes.io/projected/c976efe6-239a-4f24-a392-b1b5ba3545de-kube-api-access-7wvfv\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.380521 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.380558 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c976efe6-239a-4f24-a392-b1b5ba3545de-config\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.380633 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c976efe6-239a-4f24-a392-b1b5ba3545de-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc 
kubenswrapper[4923]: I0224 03:11:41.380675 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c976efe6-239a-4f24-a392-b1b5ba3545de-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.380733 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c976efe6-239a-4f24-a392-b1b5ba3545de-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.380783 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c976efe6-239a-4f24-a392-b1b5ba3545de-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.381091 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c976efe6-239a-4f24-a392-b1b5ba3545de-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.381662 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.381770 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c976efe6-239a-4f24-a392-b1b5ba3545de-config\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.381854 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c976efe6-239a-4f24-a392-b1b5ba3545de-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.386822 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c976efe6-239a-4f24-a392-b1b5ba3545de-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.392637 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c976efe6-239a-4f24-a392-b1b5ba3545de-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.392964 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c976efe6-239a-4f24-a392-b1b5ba3545de-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.399856 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wvfv\" (UniqueName: \"kubernetes.io/projected/c976efe6-239a-4f24-a392-b1b5ba3545de-kube-api-access-7wvfv\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " 
pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.404063 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c976efe6-239a-4f24-a392-b1b5ba3545de\") " pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.508787 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.837250 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.840253 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.842937 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.843435 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gt7fp" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.843537 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.843479 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.864049 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.994054 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/34d16b71-0cf5-4143-9225-3e44441dc2da-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.994092 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.994119 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d16b71-0cf5-4143-9225-3e44441dc2da-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.994167 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/34d16b71-0cf5-4143-9225-3e44441dc2da-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.994190 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d16b71-0cf5-4143-9225-3e44441dc2da-config\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.994221 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34d16b71-0cf5-4143-9225-3e44441dc2da-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.994253 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d16b71-0cf5-4143-9225-3e44441dc2da-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:41 crc kubenswrapper[4923]: I0224 03:11:41.994482 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm57x\" (UniqueName: \"kubernetes.io/projected/34d16b71-0cf5-4143-9225-3e44441dc2da-kube-api-access-nm57x\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.096476 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d16b71-0cf5-4143-9225-3e44441dc2da-config\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.096545 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34d16b71-0cf5-4143-9225-3e44441dc2da-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.096593 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d16b71-0cf5-4143-9225-3e44441dc2da-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: 
I0224 03:11:42.096628 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm57x\" (UniqueName: \"kubernetes.io/projected/34d16b71-0cf5-4143-9225-3e44441dc2da-kube-api-access-nm57x\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.096675 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d16b71-0cf5-4143-9225-3e44441dc2da-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.096702 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.096732 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d16b71-0cf5-4143-9225-3e44441dc2da-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.096793 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/34d16b71-0cf5-4143-9225-3e44441dc2da-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.097259 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/34d16b71-0cf5-4143-9225-3e44441dc2da-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.098200 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/34d16b71-0cf5-4143-9225-3e44441dc2da-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.098369 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.101589 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d16b71-0cf5-4143-9225-3e44441dc2da-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.101598 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d16b71-0cf5-4143-9225-3e44441dc2da-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.101662 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/34d16b71-0cf5-4143-9225-3e44441dc2da-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " 
pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.101989 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d16b71-0cf5-4143-9225-3e44441dc2da-config\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.118133 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.120740 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm57x\" (UniqueName: \"kubernetes.io/projected/34d16b71-0cf5-4143-9225-3e44441dc2da-kube-api-access-nm57x\") pod \"ovsdbserver-nb-0\" (UID: \"34d16b71-0cf5-4143-9225-3e44441dc2da\") " pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:42 crc kubenswrapper[4923]: I0224 03:11:42.168282 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.907729 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.908211 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4vmc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-jfgpx_openstack(de25eb29-6bef-4d9c-9078-35fb4dc5dc69): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.909461 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" podUID="de25eb29-6bef-4d9c-9078-35fb4dc5dc69" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.929398 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.929630 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ljt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-69rxm_openstack(5cb1d966-13e1-4ab1-8707-b06cf4f62036): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.930902 4923 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" podUID="5cb1d966-13e1-4ab1-8707-b06cf4f62036" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.939477 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.939656 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkw8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-rhvwd_openstack(a4795fab-2fa1-4b70-b354-adf47ecf0575): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.940926 4923 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" podUID="a4795fab-2fa1-4b70-b354-adf47ecf0575" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.968764 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.968919 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djlss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-gjls5_openstack(8fb6a340-692e-43b8-aa57-c7bb67f0ba7c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:11:45 crc kubenswrapper[4923]: E0224 03:11:45.970154 4923 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" podUID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" Feb 24 03:11:46 crc kubenswrapper[4923]: E0224 03:11:46.231032 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" podUID="a4795fab-2fa1-4b70-b354-adf47ecf0575" Feb 24 03:11:46 crc kubenswrapper[4923]: E0224 03:11:46.232400 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" podUID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" Feb 24 03:11:46 crc kubenswrapper[4923]: I0224 03:11:46.326821 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.245706 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b2879b26-9173-4d23-b6f4-9c9e4c43f08e","Type":"ContainerStarted","Data":"612684651be3a7a2ace5ee45f2aa16df735af5b0a14dfb573b160c2a09cba87e"} Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.248476 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" event={"ID":"de25eb29-6bef-4d9c-9078-35fb4dc5dc69","Type":"ContainerDied","Data":"f3e6d7c0fcfa0f400cf6e52671f1c3caffe2f933761f3c9554e1b7b086a293b0"} Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.248519 4923 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f3e6d7c0fcfa0f400cf6e52671f1c3caffe2f933761f3c9554e1b7b086a293b0" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.380018 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.401518 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.494786 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vmc2\" (UniqueName: \"kubernetes.io/projected/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-kube-api-access-4vmc2\") pod \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\" (UID: \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\") " Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.494857 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-config\") pod \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\" (UID: \"de25eb29-6bef-4d9c-9078-35fb4dc5dc69\") " Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.494985 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ljt8\" (UniqueName: \"kubernetes.io/projected/5cb1d966-13e1-4ab1-8707-b06cf4f62036-kube-api-access-6ljt8\") pod \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.495033 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-config\") pod \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.495086 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-dns-svc\") pod \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\" (UID: \"5cb1d966-13e1-4ab1-8707-b06cf4f62036\") " Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.495567 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-config" (OuterVolumeSpecName: "config") pod "5cb1d966-13e1-4ab1-8707-b06cf4f62036" (UID: "5cb1d966-13e1-4ab1-8707-b06cf4f62036"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.495625 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5cb1d966-13e1-4ab1-8707-b06cf4f62036" (UID: "5cb1d966-13e1-4ab1-8707-b06cf4f62036"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.495567 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-config" (OuterVolumeSpecName: "config") pod "de25eb29-6bef-4d9c-9078-35fb4dc5dc69" (UID: "de25eb29-6bef-4d9c-9078-35fb4dc5dc69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.495976 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.496001 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb1d966-13e1-4ab1-8707-b06cf4f62036-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.496015 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.501503 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-kube-api-access-4vmc2" (OuterVolumeSpecName: "kube-api-access-4vmc2") pod "de25eb29-6bef-4d9c-9078-35fb4dc5dc69" (UID: "de25eb29-6bef-4d9c-9078-35fb4dc5dc69"). InnerVolumeSpecName "kube-api-access-4vmc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.501887 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb1d966-13e1-4ab1-8707-b06cf4f62036-kube-api-access-6ljt8" (OuterVolumeSpecName: "kube-api-access-6ljt8") pod "5cb1d966-13e1-4ab1-8707-b06cf4f62036" (UID: "5cb1d966-13e1-4ab1-8707-b06cf4f62036"). InnerVolumeSpecName "kube-api-access-6ljt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.597794 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vmc2\" (UniqueName: \"kubernetes.io/projected/de25eb29-6bef-4d9c-9078-35fb4dc5dc69-kube-api-access-4vmc2\") on node \"crc\" DevicePath \"\"" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.597825 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ljt8\" (UniqueName: \"kubernetes.io/projected/5cb1d966-13e1-4ab1-8707-b06cf4f62036-kube-api-access-6ljt8\") on node \"crc\" DevicePath \"\"" Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.760561 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6l624"] Feb 24 03:11:47 crc kubenswrapper[4923]: I0224 03:11:47.773833 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.134841 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.153891 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.242834 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.255131 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.255161 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jfgpx" Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.255178 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-69rxm" event={"ID":"5cb1d966-13e1-4ab1-8707-b06cf4f62036","Type":"ContainerDied","Data":"48cc35ad5877f3fa0190ad1e038b8efdcf84c39995d49fc9d03340fec5145494"} Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.306621 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-69rxm"] Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.315621 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-69rxm"] Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.338051 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jfgpx"] Feb 24 03:11:48 crc kubenswrapper[4923]: I0224 03:11:48.347701 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jfgpx"] Feb 24 03:11:49 crc kubenswrapper[4923]: I0224 03:11:49.029563 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-555wh"] Feb 24 03:11:49 crc kubenswrapper[4923]: I0224 03:11:49.121044 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 24 03:11:49 crc kubenswrapper[4923]: I0224 03:11:49.722156 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cb1d966-13e1-4ab1-8707-b06cf4f62036" path="/var/lib/kubelet/pods/5cb1d966-13e1-4ab1-8707-b06cf4f62036/volumes" Feb 24 03:11:49 crc kubenswrapper[4923]: I0224 03:11:49.722529 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de25eb29-6bef-4d9c-9078-35fb4dc5dc69" path="/var/lib/kubelet/pods/de25eb29-6bef-4d9c-9078-35fb4dc5dc69/volumes" Feb 24 03:11:50 crc kubenswrapper[4923]: W0224 03:11:50.472765 4923 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71ebe37b_5831_4545_8f6a_8db6e194982f.slice/crio-91d3c3983b7d7a607c6c64c0ffa22ff6a953fc2886022a1f43471dc9848ebb0c WatchSource:0}: Error finding container 91d3c3983b7d7a607c6c64c0ffa22ff6a953fc2886022a1f43471dc9848ebb0c: Status 404 returned error can't find the container with id 91d3c3983b7d7a607c6c64c0ffa22ff6a953fc2886022a1f43471dc9848ebb0c Feb 24 03:11:50 crc kubenswrapper[4923]: W0224 03:11:50.566749 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22ccde27_2e54_4d62_8cc4_8b12ea5e92a7.slice/crio-8638d3e60f617bc60b53106990db9a17d0cb4eb96136a5e4da7a198185ad5421 WatchSource:0}: Error finding container 8638d3e60f617bc60b53106990db9a17d0cb4eb96136a5e4da7a198185ad5421: Status 404 returned error can't find the container with id 8638d3e60f617bc60b53106990db9a17d0cb4eb96136a5e4da7a198185ad5421 Feb 24 03:11:51 crc kubenswrapper[4923]: I0224 03:11:51.278920 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"71ebe37b-5831-4545-8f6a-8db6e194982f","Type":"ContainerStarted","Data":"91d3c3983b7d7a607c6c64c0ffa22ff6a953fc2886022a1f43471dc9848ebb0c"} Feb 24 03:11:51 crc kubenswrapper[4923]: I0224 03:11:51.280042 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6l624" event={"ID":"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095","Type":"ContainerStarted","Data":"208867412fa665e6fa3fffe41172863e7b0e79aa98c8fed25ad913d91b9dc974"} Feb 24 03:11:51 crc kubenswrapper[4923]: I0224 03:11:51.281206 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6bbccab-2bc6-47d9-b1d1-896c661112d8","Type":"ContainerStarted","Data":"932c87fcbd737bd40fd9ed2d63d22ddf51e87c53a9ecd3694c4b576936aa9708"} Feb 24 03:11:51 crc kubenswrapper[4923]: I0224 03:11:51.282024 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-555wh" event={"ID":"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7","Type":"ContainerStarted","Data":"8638d3e60f617bc60b53106990db9a17d0cb4eb96136a5e4da7a198185ad5421"} Feb 24 03:11:51 crc kubenswrapper[4923]: I0224 03:11:51.283216 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2c9c3801-205a-40fd-929f-587f5aaa9ca2","Type":"ContainerStarted","Data":"fa84ffc2207e0e835f9eda7814045a624a16946ab991bdda1bddeaf82d83c182"} Feb 24 03:11:51 crc kubenswrapper[4923]: I0224 03:11:51.284224 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"34d16b71-0cf5-4143-9225-3e44441dc2da","Type":"ContainerStarted","Data":"ce52c5bf097d023c4992629e2dc52039047fb5b1aff3302ea55be9110d5c0946"} Feb 24 03:11:51 crc kubenswrapper[4923]: I0224 03:11:51.285331 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c976efe6-239a-4f24-a392-b1b5ba3545de","Type":"ContainerStarted","Data":"a63a33ad0dcdd16cc2bd4bd59ae128cddef61fed51ebe2db305c1c4a9d10c666"} Feb 24 03:11:52 crc kubenswrapper[4923]: I0224 03:11:52.293383 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0","Type":"ContainerStarted","Data":"780d10fe0138b329c1369b0f1cbd1e1e5c8fbef05d9caf5651b24f4d82a4f4d2"} Feb 24 03:11:52 crc kubenswrapper[4923]: I0224 03:11:52.295469 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d9950d0b-d980-4e4f-82b4-9f616c6c50a3","Type":"ContainerStarted","Data":"1ef05ab77af0e174ff2a3a3a25eb2a8838b22904e83d0e1d6e1693dfaaf19763"} Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.373052 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"2c9c3801-205a-40fd-929f-587f5aaa9ca2","Type":"ContainerStarted","Data":"ee9a46700c2b756a12e9d772275ddfce43cafdb269004b3027f5b6816e097938"} Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.373693 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.375390 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"34d16b71-0cf5-4143-9225-3e44441dc2da","Type":"ContainerStarted","Data":"662b1f481a69d1e6db372aba063061465c18562c1b019a86e645bae13edc572c"} Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.380000 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" event={"ID":"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c","Type":"ContainerStarted","Data":"cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5"} Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.383036 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c976efe6-239a-4f24-a392-b1b5ba3545de","Type":"ContainerStarted","Data":"1157059554fb5af4a283b06903eb0ec6e812b93407b43af331d7407b54f1dcf5"} Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.384506 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"71ebe37b-5831-4545-8f6a-8db6e194982f","Type":"ContainerStarted","Data":"0d2c9947f54f339626df65be074d58197bdfec8c9175334158e1d36a2fece671"} Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.387739 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.254835166 podStartE2EDuration="26.387722305s" podCreationTimestamp="2026-02-24 03:11:32 +0000 UTC" firstStartedPulling="2026-02-24 03:11:50.477730634 +0000 UTC m=+1034.494801447" lastFinishedPulling="2026-02-24 03:11:56.610617743 +0000 UTC m=+1040.627688586" 
observedRunningTime="2026-02-24 03:11:58.385397884 +0000 UTC m=+1042.402468707" watchObservedRunningTime="2026-02-24 03:11:58.387722305 +0000 UTC m=+1042.404793118" Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.389132 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6l624" event={"ID":"6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095","Type":"ContainerStarted","Data":"e0e5cd4b78503c68759039029b6e80bc039774c674efba7ba0b41054eadff478"} Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.389916 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6l624" Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.392032 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6bbccab-2bc6-47d9-b1d1-896c661112d8","Type":"ContainerStarted","Data":"201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc"} Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.392316 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.394662 4923 generic.go:334] "Generic (PLEG): container finished" podID="22ccde27-2e54-4d62-8cc4-8b12ea5e92a7" containerID="9f0f8572bc1a9ae8023e5f6bef7e7867dbd636585bbd7b78d4068db2d820ee0c" exitCode=0 Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.394726 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-555wh" event={"ID":"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7","Type":"ContainerDied","Data":"9f0f8572bc1a9ae8023e5f6bef7e7867dbd636585bbd7b78d4068db2d820ee0c"} Feb 24 03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.399959 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b2879b26-9173-4d23-b6f4-9c9e4c43f08e","Type":"ContainerStarted","Data":"61aadbc1158217678723884dbfc79b6ae19c1fd6746e91c690cf67d20f709d5f"} Feb 24 
03:11:58 crc kubenswrapper[4923]: I0224 03:11:58.486063 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6l624" podStartSLOduration=13.596401802 podStartE2EDuration="20.486046455s" podCreationTimestamp="2026-02-24 03:11:38 +0000 UTC" firstStartedPulling="2026-02-24 03:11:50.470895365 +0000 UTC m=+1034.487966178" lastFinishedPulling="2026-02-24 03:11:57.360540008 +0000 UTC m=+1041.377610831" observedRunningTime="2026-02-24 03:11:58.480836218 +0000 UTC m=+1042.497907051" watchObservedRunningTime="2026-02-24 03:11:58.486046455 +0000 UTC m=+1042.503117268" Feb 24 03:11:59 crc kubenswrapper[4923]: I0224 03:11:59.413012 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-555wh" event={"ID":"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7","Type":"ContainerStarted","Data":"b36aeab28ee5f47ec5868d2be5f1c546142d2e743abb53397800615939156402"} Feb 24 03:11:59 crc kubenswrapper[4923]: I0224 03:11:59.414989 4923 generic.go:334] "Generic (PLEG): container finished" podID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" containerID="cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5" exitCode=0 Feb 24 03:11:59 crc kubenswrapper[4923]: I0224 03:11:59.415099 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" event={"ID":"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c","Type":"ContainerDied","Data":"cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5"} Feb 24 03:11:59 crc kubenswrapper[4923]: I0224 03:11:59.440125 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.426584512 podStartE2EDuration="24.440108615s" podCreationTimestamp="2026-02-24 03:11:35 +0000 UTC" firstStartedPulling="2026-02-24 03:11:50.473152804 +0000 UTC m=+1034.490223617" lastFinishedPulling="2026-02-24 03:11:57.486676907 +0000 UTC m=+1041.503747720" observedRunningTime="2026-02-24 03:11:58.498488201 
+0000 UTC m=+1042.515559014" watchObservedRunningTime="2026-02-24 03:11:59.440108615 +0000 UTC m=+1043.457179448" Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.427930 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-555wh" event={"ID":"22ccde27-2e54-4d62-8cc4-8b12ea5e92a7","Type":"ContainerStarted","Data":"1dc563d3e0a48faa6fa1dbff710fac2b74cf5e453c4f9acce7d5602d3536a2b4"} Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.428352 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.428381 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.430719 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"34d16b71-0cf5-4143-9225-3e44441dc2da","Type":"ContainerStarted","Data":"b80c442559b87a82408c0163b2b4233719201687506c8891ce51da325d2e8189"} Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.432813 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" event={"ID":"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c","Type":"ContainerStarted","Data":"575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424"} Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.433036 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.434974 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c976efe6-239a-4f24-a392-b1b5ba3545de","Type":"ContainerStarted","Data":"fb87b4b69165d8a268caad8e0944ad9d91927f5279b81db68f193c7fb14d1d65"} Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.463660 4923 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-controller-ovs-555wh" podStartSLOduration=16.332678052 podStartE2EDuration="22.463642349s" podCreationTimestamp="2026-02-24 03:11:38 +0000 UTC" firstStartedPulling="2026-02-24 03:11:50.578207711 +0000 UTC m=+1034.595278514" lastFinishedPulling="2026-02-24 03:11:56.709171958 +0000 UTC m=+1040.726242811" observedRunningTime="2026-02-24 03:12:00.458002241 +0000 UTC m=+1044.475073084" watchObservedRunningTime="2026-02-24 03:12:00.463642349 +0000 UTC m=+1044.480713162" Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.503086 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" podStartSLOduration=3.161778119 podStartE2EDuration="31.503063943s" podCreationTimestamp="2026-02-24 03:11:29 +0000 UTC" firstStartedPulling="2026-02-24 03:11:29.795358484 +0000 UTC m=+1013.812429297" lastFinishedPulling="2026-02-24 03:11:58.136644318 +0000 UTC m=+1042.153715121" observedRunningTime="2026-02-24 03:12:00.476054154 +0000 UTC m=+1044.493124967" watchObservedRunningTime="2026-02-24 03:12:00.503063943 +0000 UTC m=+1044.520134756" Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.512355 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.806487476000001 podStartE2EDuration="20.512283115s" podCreationTimestamp="2026-02-24 03:11:40 +0000 UTC" firstStartedPulling="2026-02-24 03:11:50.582652007 +0000 UTC m=+1034.599722820" lastFinishedPulling="2026-02-24 03:11:59.288447646 +0000 UTC m=+1043.305518459" observedRunningTime="2026-02-24 03:12:00.503380271 +0000 UTC m=+1044.520451124" watchObservedRunningTime="2026-02-24 03:12:00.512283115 +0000 UTC m=+1044.529353968" Feb 24 03:12:00 crc kubenswrapper[4923]: I0224 03:12:00.533339 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.721350992 podStartE2EDuration="20.533321437s" 
podCreationTimestamp="2026-02-24 03:11:40 +0000 UTC" firstStartedPulling="2026-02-24 03:11:50.489142304 +0000 UTC m=+1034.506213127" lastFinishedPulling="2026-02-24 03:11:59.301112749 +0000 UTC m=+1043.318183572" observedRunningTime="2026-02-24 03:12:00.524188277 +0000 UTC m=+1044.541259100" watchObservedRunningTime="2026-02-24 03:12:00.533321437 +0000 UTC m=+1044.550392260" Feb 24 03:12:01 crc kubenswrapper[4923]: I0224 03:12:01.449112 4923 generic.go:334] "Generic (PLEG): container finished" podID="b2879b26-9173-4d23-b6f4-9c9e4c43f08e" containerID="61aadbc1158217678723884dbfc79b6ae19c1fd6746e91c690cf67d20f709d5f" exitCode=0 Feb 24 03:12:01 crc kubenswrapper[4923]: I0224 03:12:01.449261 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b2879b26-9173-4d23-b6f4-9c9e4c43f08e","Type":"ContainerDied","Data":"61aadbc1158217678723884dbfc79b6ae19c1fd6746e91c690cf67d20f709d5f"} Feb 24 03:12:01 crc kubenswrapper[4923]: I0224 03:12:01.453142 4923 generic.go:334] "Generic (PLEG): container finished" podID="71ebe37b-5831-4545-8f6a-8db6e194982f" containerID="0d2c9947f54f339626df65be074d58197bdfec8c9175334158e1d36a2fece671" exitCode=0 Feb 24 03:12:01 crc kubenswrapper[4923]: I0224 03:12:01.453406 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"71ebe37b-5831-4545-8f6a-8db6e194982f","Type":"ContainerDied","Data":"0d2c9947f54f339626df65be074d58197bdfec8c9175334158e1d36a2fece671"} Feb 24 03:12:01 crc kubenswrapper[4923]: I0224 03:12:01.509751 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 24 03:12:02 crc kubenswrapper[4923]: I0224 03:12:02.168821 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 24 03:12:02 crc kubenswrapper[4923]: I0224 03:12:02.467427 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"b2879b26-9173-4d23-b6f4-9c9e4c43f08e","Type":"ContainerStarted","Data":"d938b62055fba86fd6026ff5d714fbb366d95c2ea3bff695d81ea324a8c3b759"} Feb 24 03:12:02 crc kubenswrapper[4923]: I0224 03:12:02.471254 4923 generic.go:334] "Generic (PLEG): container finished" podID="a4795fab-2fa1-4b70-b354-adf47ecf0575" containerID="eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647" exitCode=0 Feb 24 03:12:02 crc kubenswrapper[4923]: I0224 03:12:02.471321 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" event={"ID":"a4795fab-2fa1-4b70-b354-adf47ecf0575","Type":"ContainerDied","Data":"eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647"} Feb 24 03:12:02 crc kubenswrapper[4923]: I0224 03:12:02.475780 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"71ebe37b-5831-4545-8f6a-8db6e194982f","Type":"ContainerStarted","Data":"8885b3e3e5ac81fb9b197fe5b5ae26e2e3873fa3315f8e838989010a90cab506"} Feb 24 03:12:02 crc kubenswrapper[4923]: I0224 03:12:02.507627 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.030812855 podStartE2EDuration="31.507600912s" podCreationTimestamp="2026-02-24 03:11:31 +0000 UTC" firstStartedPulling="2026-02-24 03:11:47.233653715 +0000 UTC m=+1031.250724528" lastFinishedPulling="2026-02-24 03:11:56.710441742 +0000 UTC m=+1040.727512585" observedRunningTime="2026-02-24 03:12:02.500868106 +0000 UTC m=+1046.517938959" watchObservedRunningTime="2026-02-24 03:12:02.507600912 +0000 UTC m=+1046.524671765" Feb 24 03:12:02 crc kubenswrapper[4923]: I0224 03:12:02.509756 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 24 03:12:02 crc kubenswrapper[4923]: I0224 03:12:02.577227 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 24 
03:12:02 crc kubenswrapper[4923]: I0224 03:12:02.609650 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.378140763 podStartE2EDuration="32.609632109s" podCreationTimestamp="2026-02-24 03:11:30 +0000 UTC" firstStartedPulling="2026-02-24 03:11:50.477381475 +0000 UTC m=+1034.494452288" lastFinishedPulling="2026-02-24 03:11:56.708872831 +0000 UTC m=+1040.725943634" observedRunningTime="2026-02-24 03:12:02.560404238 +0000 UTC m=+1046.577475071" watchObservedRunningTime="2026-02-24 03:12:02.609632109 +0000 UTC m=+1046.626702922" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.034378 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.035699 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.169509 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.217066 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.345094 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.484091 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" event={"ID":"a4795fab-2fa1-4b70-b354-adf47ecf0575","Type":"ContainerStarted","Data":"72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef"} Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.504322 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" 
podStartSLOduration=-9223372001.35049 podStartE2EDuration="35.504284911s" podCreationTimestamp="2026-02-24 03:11:28 +0000 UTC" firstStartedPulling="2026-02-24 03:11:29.586814552 +0000 UTC m=+1013.603885365" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:03.503735927 +0000 UTC m=+1047.520806740" watchObservedRunningTime="2026-02-24 03:12:03.504284911 +0000 UTC m=+1047.521355724" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.539288 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.548099 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.785091 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gjls5"] Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.785287 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" podUID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" containerName="dnsmasq-dns" containerID="cri-o://575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424" gracePeriod=10 Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.820647 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-l8zpt"] Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.821802 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.824892 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.834515 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-l8zpt"] Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.855364 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-pnpxx"] Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.856345 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.859759 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.870764 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pnpxx"] Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.908288 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-config\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.908382 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.908415 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.908439 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frm9m\" (UniqueName: \"kubernetes.io/projected/ba1685d9-4549-4187-a44c-32abe83b6890-kube-api-access-frm9m\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.972217 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rhvwd"] Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.992553 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-dtx22"] Feb 24 03:12:03 crc kubenswrapper[4923]: I0224 03:12:03.993816 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:03.998535 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.010628 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/459e20ec-ab36-4745-9a6b-8c3832560d72-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.010713 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/459e20ec-ab36-4745-9a6b-8c3832560d72-ovs-rundir\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.010752 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459e20ec-ab36-4745-9a6b-8c3832560d72-config\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.010784 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/459e20ec-ab36-4745-9a6b-8c3832560d72-ovn-rundir\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.010875 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-config\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.010917 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.010943 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459e20ec-ab36-4745-9a6b-8c3832560d72-combined-ca-bundle\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.010969 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm22n\" (UniqueName: \"kubernetes.io/projected/459e20ec-ab36-4745-9a6b-8c3832560d72-kube-api-access-wm22n\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.010996 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.011018 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frm9m\" 
(UniqueName: \"kubernetes.io/projected/ba1685d9-4549-4187-a44c-32abe83b6890-kube-api-access-frm9m\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.012132 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dtx22"] Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.017224 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.017224 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-config\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.017436 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.024948 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.033273 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.035848 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.035963 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.035983 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-km4pp" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.036169 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.037481 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frm9m\" (UniqueName: \"kubernetes.io/projected/ba1685d9-4549-4187-a44c-32abe83b6890-kube-api-access-frm9m\") pod \"dnsmasq-dns-6bc7876d45-l8zpt\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.067903 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.098666 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.112824 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj62l\" (UniqueName: \"kubernetes.io/projected/e0d59d8f-593d-437e-9450-93fb5bbaa025-kube-api-access-kj62l\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.112884 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d59d8f-593d-437e-9450-93fb5bbaa025-config\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.112912 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d59d8f-593d-437e-9450-93fb5bbaa025-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.112965 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/459e20ec-ab36-4745-9a6b-8c3832560d72-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.112991 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/459e20ec-ab36-4745-9a6b-8c3832560d72-ovs-rundir\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113016 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-dns-svc\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113045 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/459e20ec-ab36-4745-9a6b-8c3832560d72-config\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113071 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8cpg\" (UniqueName: \"kubernetes.io/projected/3728aab2-9679-48c9-8efa-fec1eba7bb89-kube-api-access-c8cpg\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113101 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/459e20ec-ab36-4745-9a6b-8c3832560d72-ovn-rundir\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113409 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/459e20ec-ab36-4745-9a6b-8c3832560d72-ovs-rundir\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113419 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/459e20ec-ab36-4745-9a6b-8c3832560d72-ovn-rundir\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113626 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e0d59d8f-593d-437e-9450-93fb5bbaa025-scripts\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113653 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d59d8f-593d-437e-9450-93fb5bbaa025-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113728 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113860 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d59d8f-593d-437e-9450-93fb5bbaa025-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113906 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-config\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113943 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.113975 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e0d59d8f-593d-437e-9450-93fb5bbaa025-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.114032 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459e20ec-ab36-4745-9a6b-8c3832560d72-combined-ca-bundle\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.114055 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm22n\" (UniqueName: \"kubernetes.io/projected/459e20ec-ab36-4745-9a6b-8c3832560d72-kube-api-access-wm22n\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.114105 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459e20ec-ab36-4745-9a6b-8c3832560d72-config\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.117766 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459e20ec-ab36-4745-9a6b-8c3832560d72-combined-ca-bundle\") 
pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.118400 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/459e20ec-ab36-4745-9a6b-8c3832560d72-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.140092 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.141322 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm22n\" (UniqueName: \"kubernetes.io/projected/459e20ec-ab36-4745-9a6b-8c3832560d72-kube-api-access-wm22n\") pod \"ovn-controller-metrics-pnpxx\" (UID: \"459e20ec-ab36-4745-9a6b-8c3832560d72\") " pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215277 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-dns-svc\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215345 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8cpg\" (UniqueName: \"kubernetes.io/projected/3728aab2-9679-48c9-8efa-fec1eba7bb89-kube-api-access-c8cpg\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215381 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d59d8f-593d-437e-9450-93fb5bbaa025-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215398 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0d59d8f-593d-437e-9450-93fb5bbaa025-scripts\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215433 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215473 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d59d8f-593d-437e-9450-93fb5bbaa025-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215489 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-config\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215506 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e0d59d8f-593d-437e-9450-93fb5bbaa025-ovn-rundir\") pod 
\"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215524 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215566 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj62l\" (UniqueName: \"kubernetes.io/projected/e0d59d8f-593d-437e-9450-93fb5bbaa025-kube-api-access-kj62l\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215590 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d59d8f-593d-437e-9450-93fb5bbaa025-config\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.215609 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d59d8f-593d-437e-9450-93fb5bbaa025-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.216768 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d59d8f-593d-437e-9450-93fb5bbaa025-config\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.216799 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0d59d8f-593d-437e-9450-93fb5bbaa025-scripts\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.217002 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.217024 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.217043 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-config\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.217648 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-dns-svc\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.217997 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e0d59d8f-593d-437e-9450-93fb5bbaa025-ovn-rundir\") pod \"ovn-northd-0\" 
(UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.223914 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0d59d8f-593d-437e-9450-93fb5bbaa025-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.226450 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d59d8f-593d-437e-9450-93fb5bbaa025-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.226956 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0d59d8f-593d-437e-9450-93fb5bbaa025-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.235951 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj62l\" (UniqueName: \"kubernetes.io/projected/e0d59d8f-593d-437e-9450-93fb5bbaa025-kube-api-access-kj62l\") pod \"ovn-northd-0\" (UID: \"e0d59d8f-593d-437e-9450-93fb5bbaa025\") " pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.237938 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8cpg\" (UniqueName: \"kubernetes.io/projected/3728aab2-9679-48c9-8efa-fec1eba7bb89-kube-api-access-c8cpg\") pod \"dnsmasq-dns-8554648995-dtx22\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.243769 4923 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pnpxx" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.261938 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.316217 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-config\") pod \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.316429 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-dns-svc\") pod \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.316477 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djlss\" (UniqueName: \"kubernetes.io/projected/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-kube-api-access-djlss\") pod \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\" (UID: \"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c\") " Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.323998 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-kube-api-access-djlss" (OuterVolumeSpecName: "kube-api-access-djlss") pod "8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" (UID: "8fb6a340-692e-43b8-aa57-c7bb67f0ba7c"). InnerVolumeSpecName "kube-api-access-djlss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.332704 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.357963 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" (UID: "8fb6a340-692e-43b8-aa57-c7bb67f0ba7c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.372013 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-config" (OuterVolumeSpecName: "config") pod "8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" (UID: "8fb6a340-692e-43b8-aa57-c7bb67f0ba7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.384199 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.418799 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.418830 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djlss\" (UniqueName: \"kubernetes.io/projected/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-kube-api-access-djlss\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.418842 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.501672 4923 generic.go:334] "Generic (PLEG): container finished" podID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" containerID="575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424" exitCode=0 Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.502937 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.503941 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" event={"ID":"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c","Type":"ContainerDied","Data":"575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424"} Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.503991 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gjls5" event={"ID":"8fb6a340-692e-43b8-aa57-c7bb67f0ba7c","Type":"ContainerDied","Data":"ec6673eba3cd6ef33121c0d19b5d55790da08557bdbf61f6d13b251fe1d6356f"} Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.504013 4923 scope.go:117] "RemoveContainer" containerID="575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.504030 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" podUID="a4795fab-2fa1-4b70-b354-adf47ecf0575" containerName="dnsmasq-dns" containerID="cri-o://72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef" gracePeriod=10 Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.538902 4923 scope.go:117] "RemoveContainer" containerID="cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.541275 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gjls5"] Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.559168 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gjls5"] Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.594737 4923 scope.go:117] "RemoveContainer" containerID="575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424" Feb 24 03:12:04 crc kubenswrapper[4923]: E0224 03:12:04.595241 4923 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424\": container with ID starting with 575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424 not found: ID does not exist" containerID="575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.595315 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424"} err="failed to get container status \"575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424\": rpc error: code = NotFound desc = could not find container \"575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424\": container with ID starting with 575de305a6df8381e2aebf0b70e2f7258fb4d7868f93f29534d40cb3796c9424 not found: ID does not exist" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.595359 4923 scope.go:117] "RemoveContainer" containerID="cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5" Feb 24 03:12:04 crc kubenswrapper[4923]: E0224 03:12:04.596688 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5\": container with ID starting with cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5 not found: ID does not exist" containerID="cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.596748 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5"} err="failed to get container status \"cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5\": rpc error: code = NotFound desc = could 
not find container \"cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5\": container with ID starting with cb66878a6b71fbbc1dfbed3a66a90d4405ef0e9f31519868df6b69110777bac5 not found: ID does not exist" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.603496 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-l8zpt"] Feb 24 03:12:04 crc kubenswrapper[4923]: W0224 03:12:04.605648 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba1685d9_4549_4187_a44c_32abe83b6890.slice/crio-7599923bf631fb115a0d51e8e08b6f9d638ca4b7d0e294b3cc9b9bb38ebf94af WatchSource:0}: Error finding container 7599923bf631fb115a0d51e8e08b6f9d638ca4b7d0e294b3cc9b9bb38ebf94af: Status 404 returned error can't find the container with id 7599923bf631fb115a0d51e8e08b6f9d638ca4b7d0e294b3cc9b9bb38ebf94af Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.679851 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pnpxx"] Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.818498 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dtx22"] Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.876355 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.926283 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-dns-svc\") pod \"a4795fab-2fa1-4b70-b354-adf47ecf0575\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.926369 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkw8d\" (UniqueName: \"kubernetes.io/projected/a4795fab-2fa1-4b70-b354-adf47ecf0575-kube-api-access-tkw8d\") pod \"a4795fab-2fa1-4b70-b354-adf47ecf0575\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.926431 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-config\") pod \"a4795fab-2fa1-4b70-b354-adf47ecf0575\" (UID: \"a4795fab-2fa1-4b70-b354-adf47ecf0575\") " Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.940156 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.946519 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4795fab-2fa1-4b70-b354-adf47ecf0575-kube-api-access-tkw8d" (OuterVolumeSpecName: "kube-api-access-tkw8d") pod "a4795fab-2fa1-4b70-b354-adf47ecf0575" (UID: "a4795fab-2fa1-4b70-b354-adf47ecf0575"). InnerVolumeSpecName "kube-api-access-tkw8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.976888 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-config" (OuterVolumeSpecName: "config") pod "a4795fab-2fa1-4b70-b354-adf47ecf0575" (UID: "a4795fab-2fa1-4b70-b354-adf47ecf0575"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:04 crc kubenswrapper[4923]: I0224 03:12:04.996340 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4795fab-2fa1-4b70-b354-adf47ecf0575" (UID: "a4795fab-2fa1-4b70-b354-adf47ecf0575"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.028384 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkw8d\" (UniqueName: \"kubernetes.io/projected/a4795fab-2fa1-4b70-b354-adf47ecf0575-kube-api-access-tkw8d\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.028415 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.028425 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4795fab-2fa1-4b70-b354-adf47ecf0575-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.511746 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e0d59d8f-593d-437e-9450-93fb5bbaa025","Type":"ContainerStarted","Data":"fa7f37a3c019f95127301505516efb15f465e4ff86bf56a72ee12fbdddf079fc"} Feb 24 03:12:05 
crc kubenswrapper[4923]: I0224 03:12:05.513444 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pnpxx" event={"ID":"459e20ec-ab36-4745-9a6b-8c3832560d72","Type":"ContainerStarted","Data":"a972f1f47cac234555d9742a3ded28a46c96edc4bf194a829fd8cfc9e57a69e4"} Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.513519 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pnpxx" event={"ID":"459e20ec-ab36-4745-9a6b-8c3832560d72","Type":"ContainerStarted","Data":"23fb9098d2796b3d48f90e8ad28cde1f9b9219e2891a4ea6d9eee39a4c037d56"} Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.515929 4923 generic.go:334] "Generic (PLEG): container finished" podID="a4795fab-2fa1-4b70-b354-adf47ecf0575" containerID="72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef" exitCode=0 Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.515970 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.516029 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" event={"ID":"a4795fab-2fa1-4b70-b354-adf47ecf0575","Type":"ContainerDied","Data":"72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef"} Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.516091 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-rhvwd" event={"ID":"a4795fab-2fa1-4b70-b354-adf47ecf0575","Type":"ContainerDied","Data":"73b2fd4f9c75eb5a55ee14db33a4098344e9d3f86f120bb6bac64789363eca06"} Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.516115 4923 scope.go:117] "RemoveContainer" containerID="72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.519172 4923 generic.go:334] "Generic (PLEG): container finished" 
podID="ba1685d9-4549-4187-a44c-32abe83b6890" containerID="4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16" exitCode=0 Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.519236 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" event={"ID":"ba1685d9-4549-4187-a44c-32abe83b6890","Type":"ContainerDied","Data":"4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16"} Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.519261 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" event={"ID":"ba1685d9-4549-4187-a44c-32abe83b6890","Type":"ContainerStarted","Data":"7599923bf631fb115a0d51e8e08b6f9d638ca4b7d0e294b3cc9b9bb38ebf94af"} Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.529710 4923 generic.go:334] "Generic (PLEG): container finished" podID="3728aab2-9679-48c9-8efa-fec1eba7bb89" containerID="8920583344df99cb575a7f3bd3851daa40f98f2460ab6fd5d8af8550d189c373" exitCode=0 Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.529907 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dtx22" event={"ID":"3728aab2-9679-48c9-8efa-fec1eba7bb89","Type":"ContainerDied","Data":"8920583344df99cb575a7f3bd3851daa40f98f2460ab6fd5d8af8550d189c373"} Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.529952 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dtx22" event={"ID":"3728aab2-9679-48c9-8efa-fec1eba7bb89","Type":"ContainerStarted","Data":"de788a7e95256cde5cc76a5cdb29faa471f99bcc88fc6c2f19da178c43a72e2d"} Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.546972 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-pnpxx" podStartSLOduration=2.546942681 podStartE2EDuration="2.546942681s" podCreationTimestamp="2026-02-24 03:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:05.536176288 +0000 UTC m=+1049.553247111" watchObservedRunningTime="2026-02-24 03:12:05.546942681 +0000 UTC m=+1049.564013494" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.551662 4923 scope.go:117] "RemoveContainer" containerID="eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.625948 4923 scope.go:117] "RemoveContainer" containerID="72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.626052 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rhvwd"] Feb 24 03:12:05 crc kubenswrapper[4923]: E0224 03:12:05.626398 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef\": container with ID starting with 72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef not found: ID does not exist" containerID="72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.626433 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef"} err="failed to get container status \"72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef\": rpc error: code = NotFound desc = could not find container \"72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef\": container with ID starting with 72ffa6ef09732b2096acd1d63adb2377fccbb79f04dca42fbc818e5c8d3fdaef not found: ID does not exist" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.626460 4923 scope.go:117] "RemoveContainer" containerID="eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647" Feb 24 03:12:05 crc 
kubenswrapper[4923]: E0224 03:12:05.626817 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647\": container with ID starting with eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647 not found: ID does not exist" containerID="eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.626861 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647"} err="failed to get container status \"eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647\": rpc error: code = NotFound desc = could not find container \"eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647\": container with ID starting with eeedc68e667a462cf654ae933576f778164e8c6e21b505f315a0596105536647 not found: ID does not exist" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.636657 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-rhvwd"] Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.737064 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" path="/var/lib/kubelet/pods/8fb6a340-692e-43b8-aa57-c7bb67f0ba7c/volumes" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.769345 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4795fab-2fa1-4b70-b354-adf47ecf0575" path="/var/lib/kubelet/pods/a4795fab-2fa1-4b70-b354-adf47ecf0575/volumes" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.793578 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.808126 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6bc7876d45-l8zpt"] Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.883388 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xdptp"] Feb 24 03:12:05 crc kubenswrapper[4923]: E0224 03:12:05.883866 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4795fab-2fa1-4b70-b354-adf47ecf0575" containerName="dnsmasq-dns" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.883879 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4795fab-2fa1-4b70-b354-adf47ecf0575" containerName="dnsmasq-dns" Feb 24 03:12:05 crc kubenswrapper[4923]: E0224 03:12:05.883913 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4795fab-2fa1-4b70-b354-adf47ecf0575" containerName="init" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.883919 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4795fab-2fa1-4b70-b354-adf47ecf0575" containerName="init" Feb 24 03:12:05 crc kubenswrapper[4923]: E0224 03:12:05.883929 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" containerName="dnsmasq-dns" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.883935 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" containerName="dnsmasq-dns" Feb 24 03:12:05 crc kubenswrapper[4923]: E0224 03:12:05.883964 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" containerName="init" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.883971 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" containerName="init" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.884104 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb6a340-692e-43b8-aa57-c7bb67f0ba7c" containerName="dnsmasq-dns" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.884116 
4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4795fab-2fa1-4b70-b354-adf47ecf0575" containerName="dnsmasq-dns" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.885007 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.897101 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xdptp"] Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.964056 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.964456 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.964518 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.964558 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rst7l\" (UniqueName: \"kubernetes.io/projected/def9fcfc-1366-4652-a4c7-aeec946c3d96-kube-api-access-rst7l\") pod 
\"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:05 crc kubenswrapper[4923]: I0224 03:12:05.964614 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-config\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.066016 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-config\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.066085 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.066108 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.066159 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.066194 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rst7l\" (UniqueName: \"kubernetes.io/projected/def9fcfc-1366-4652-a4c7-aeec946c3d96-kube-api-access-rst7l\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.067235 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-config\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.068569 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.068566 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.069240 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.090597 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rst7l\" (UniqueName: \"kubernetes.io/projected/def9fcfc-1366-4652-a4c7-aeec946c3d96-kube-api-access-rst7l\") pod \"dnsmasq-dns-b8fbc5445-xdptp\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.295712 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.540541 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" event={"ID":"ba1685d9-4549-4187-a44c-32abe83b6890","Type":"ContainerStarted","Data":"ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970"} Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.540654 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" podUID="ba1685d9-4549-4187-a44c-32abe83b6890" containerName="dnsmasq-dns" containerID="cri-o://ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970" gracePeriod=10 Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.540687 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.543393 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dtx22" event={"ID":"3728aab2-9679-48c9-8efa-fec1eba7bb89","Type":"ContainerStarted","Data":"d8acbd6c41d8dc2e05d101ba696c2819a26de02861b170a1a78d9a8475d4a3e7"} Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.544036 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.593223 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" podStartSLOduration=3.59319106 podStartE2EDuration="3.59319106s" podCreationTimestamp="2026-02-24 03:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:06.564797315 +0000 UTC m=+1050.581868128" watchObservedRunningTime="2026-02-24 03:12:06.59319106 +0000 UTC m=+1050.610261873" Feb 24 03:12:06 crc kubenswrapper[4923]: I0224 03:12:06.593489 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-dtx22" podStartSLOduration=3.593484317 podStartE2EDuration="3.593484317s" podCreationTimestamp="2026-02-24 03:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:06.588578189 +0000 UTC m=+1050.605649002" watchObservedRunningTime="2026-02-24 03:12:06.593484317 +0000 UTC m=+1050.610555130" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.000197 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xdptp"] Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.050976 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.056166 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.058519 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.058521 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.063986 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.068111 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gsdc9" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.070073 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.087204 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.182420 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-ovsdbserver-sb\") pod \"ba1685d9-4549-4187-a44c-32abe83b6890\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.182565 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-dns-svc\") pod \"ba1685d9-4549-4187-a44c-32abe83b6890\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.182795 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-config\") 
pod \"ba1685d9-4549-4187-a44c-32abe83b6890\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.182854 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frm9m\" (UniqueName: \"kubernetes.io/projected/ba1685d9-4549-4187-a44c-32abe83b6890-kube-api-access-frm9m\") pod \"ba1685d9-4549-4187-a44c-32abe83b6890\" (UID: \"ba1685d9-4549-4187-a44c-32abe83b6890\") " Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.183091 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-cache\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.183122 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.183162 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.183195 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4w6\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-kube-api-access-xc4w6\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc 
kubenswrapper[4923]: I0224 03:12:07.183222 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-lock\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.183268 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.196482 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba1685d9-4549-4187-a44c-32abe83b6890-kube-api-access-frm9m" (OuterVolumeSpecName: "kube-api-access-frm9m") pod "ba1685d9-4549-4187-a44c-32abe83b6890" (UID: "ba1685d9-4549-4187-a44c-32abe83b6890"). InnerVolumeSpecName "kube-api-access-frm9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.249420 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba1685d9-4549-4187-a44c-32abe83b6890" (UID: "ba1685d9-4549-4187-a44c-32abe83b6890"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.253991 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-config" (OuterVolumeSpecName: "config") pod "ba1685d9-4549-4187-a44c-32abe83b6890" (UID: "ba1685d9-4549-4187-a44c-32abe83b6890"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.254247 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba1685d9-4549-4187-a44c-32abe83b6890" (UID: "ba1685d9-4549-4187-a44c-32abe83b6890"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284469 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-cache\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284517 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284547 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284581 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4w6\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-kube-api-access-xc4w6\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284614 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-lock\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284647 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284726 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284738 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284747 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frm9m\" (UniqueName: \"kubernetes.io/projected/ba1685d9-4549-4187-a44c-32abe83b6890-kube-api-access-frm9m\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.284758 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba1685d9-4549-4187-a44c-32abe83b6890-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.284865 4923 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.284878 4923 projected.go:194] Error preparing data for projected volume etc-swift 
for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.284923 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift podName:d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f nodeName:}" failed. No retries permitted until 2026-02-24 03:12:07.784906807 +0000 UTC m=+1051.801977620 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift") pod "swift-storage-0" (UID: "d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f") : configmap "swift-ring-files" not found Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.285420 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.285522 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-lock\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.288117 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-cache\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.292837 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.306451 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4w6\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-kube-api-access-xc4w6\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.306752 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.555140 4923 generic.go:334] "Generic (PLEG): container finished" podID="ba1685d9-4549-4187-a44c-32abe83b6890" containerID="ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970" exitCode=0 Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.555222 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.555228 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" event={"ID":"ba1685d9-4549-4187-a44c-32abe83b6890","Type":"ContainerDied","Data":"ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970"} Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.555596 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-fcvhg"] Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.555658 4923 scope.go:117] "RemoveContainer" containerID="ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970" Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.555938 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1685d9-4549-4187-a44c-32abe83b6890" containerName="dnsmasq-dns" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.555967 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1685d9-4549-4187-a44c-32abe83b6890" containerName="dnsmasq-dns" Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.555995 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba1685d9-4549-4187-a44c-32abe83b6890" containerName="init" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.556022 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba1685d9-4549-4187-a44c-32abe83b6890" containerName="init" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.556177 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba1685d9-4549-4187-a44c-32abe83b6890" containerName="dnsmasq-dns" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.556653 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-l8zpt" event={"ID":"ba1685d9-4549-4187-a44c-32abe83b6890","Type":"ContainerDied","Data":"7599923bf631fb115a0d51e8e08b6f9d638ca4b7d0e294b3cc9b9bb38ebf94af"} Feb 
24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.556767 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.558058 4923 generic.go:334] "Generic (PLEG): container finished" podID="def9fcfc-1366-4652-a4c7-aeec946c3d96" containerID="90aa696378c45d2fa75d78100316e113dffe4e97ce18a3c0a6c3593deb8ea805" exitCode=0 Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.558120 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" event={"ID":"def9fcfc-1366-4652-a4c7-aeec946c3d96","Type":"ContainerDied","Data":"90aa696378c45d2fa75d78100316e113dffe4e97ce18a3c0a6c3593deb8ea805"} Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.558378 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" event={"ID":"def9fcfc-1366-4652-a4c7-aeec946c3d96","Type":"ContainerStarted","Data":"8ac23df67b5af937f79a19b81c22be0007ac41c1ae5b584b5f3929611f7509bc"} Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.558816 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.558972 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.565696 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.572590 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e0d59d8f-593d-437e-9450-93fb5bbaa025","Type":"ContainerStarted","Data":"73f4ebec7b976bc9b058865ca41c449c9d8a81c446ff1cef7aa88cf253ecb335"} Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.572649 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"e0d59d8f-593d-437e-9450-93fb5bbaa025","Type":"ContainerStarted","Data":"d73b89e6f8653a058895c9f1c8f1e9e5f5b82c3fb5542d7d7858563ad44183d0"} Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.572856 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.590558 4923 scope.go:117] "RemoveContainer" containerID="4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.600197 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-fcvhg"] Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.600974 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-xqvdh ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-xqvdh ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-fcvhg" podUID="c7b2b129-bd9d-4f56-8ac1-796172c3ac8b" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.616856 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-84w4c"] Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.618049 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.633556 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-fcvhg"] Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.643797 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-84w4c"] Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.693836 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-scripts\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.694051 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-dispersionconf\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.694180 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-dispersionconf\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.694220 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqvdh\" (UniqueName: \"kubernetes.io/projected/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-kube-api-access-xqvdh\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " 
pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.694337 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-kube-api-access-kgxk8\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.694750 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-etc-swift\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.694907 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-swiftconf\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.694996 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-combined-ca-bundle\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.695040 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-combined-ca-bundle\") pod \"swift-ring-rebalance-fcvhg\" (UID: 
\"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.695201 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-ring-data-devices\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.695269 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-scripts\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.695484 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-ring-data-devices\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.695521 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-swiftconf\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.695577 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-etc-swift\") pod \"swift-ring-rebalance-fcvhg\" (UID: 
\"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.700699 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-l8zpt"] Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.705891 4923 scope.go:117] "RemoveContainer" containerID="ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970" Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.706542 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970\": container with ID starting with ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970 not found: ID does not exist" containerID="ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.706581 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970"} err="failed to get container status \"ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970\": rpc error: code = NotFound desc = could not find container \"ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970\": container with ID starting with ecb1e57806c644100f270e5f8b28b36dffb8d222d6be228341b9d0d69752c970 not found: ID does not exist" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.706606 4923 scope.go:117] "RemoveContainer" containerID="4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16" Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.706831 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16\": container with ID starting with 
4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16 not found: ID does not exist" containerID="4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.706887 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16"} err="failed to get container status \"4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16\": rpc error: code = NotFound desc = could not find container \"4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16\": container with ID starting with 4cc35b8dc69802ab682abf1eb98120645314af9467147d7f0fd476f245909f16 not found: ID does not exist" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.748645 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-l8zpt"] Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.797243 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-dispersionconf\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.800541 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqvdh\" (UniqueName: \"kubernetes.io/projected/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-kube-api-access-xqvdh\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.800737 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-kube-api-access-kgxk8\") pod 
\"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.800879 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-etc-swift\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.801007 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-swiftconf\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.801139 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-combined-ca-bundle\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.801232 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-combined-ca-bundle\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.801419 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-ring-data-devices\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " 
pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.801499 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-scripts\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.801577 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-ring-data-devices\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.801681 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-swiftconf\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.801792 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-etc-swift\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.801914 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-scripts\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg" Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.802051 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-dispersionconf\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.802129 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0"
Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.802356 4923 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.802429 4923 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 24 03:12:07 crc kubenswrapper[4923]: E0224 03:12:07.802558 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift podName:d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f nodeName:}" failed. No retries permitted until 2026-02-24 03:12:08.802540658 +0000 UTC m=+1052.819611481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift") pod "swift-storage-0" (UID: "d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f") : configmap "swift-ring-files" not found
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.802805 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-etc-swift\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.803417 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-ring-data-devices\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.803535 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-scripts\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.804221 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-ring-data-devices\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.804853 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-etc-swift\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.804945 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-scripts\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.810133 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-swiftconf\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.811540 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-dispersionconf\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.812504 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-combined-ca-bundle\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.812614 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-combined-ca-bundle\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.814092 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-swiftconf\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.817287 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-dispersionconf\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.823765 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-kube-api-access-kgxk8\") pod \"swift-ring-rebalance-84w4c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") " pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.829984 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqvdh\" (UniqueName: \"kubernetes.io/projected/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-kube-api-access-xqvdh\") pod \"swift-ring-rebalance-fcvhg\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") " pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:07 crc kubenswrapper[4923]: I0224 03:12:07.954350 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.394821 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.761579879 podStartE2EDuration="5.394801506s" podCreationTimestamp="2026-02-24 03:12:03 +0000 UTC" firstStartedPulling="2026-02-24 03:12:04.973870627 +0000 UTC m=+1048.990941440" lastFinishedPulling="2026-02-24 03:12:06.607092234 +0000 UTC m=+1050.624163067" observedRunningTime="2026-02-24 03:12:07.732053318 +0000 UTC m=+1051.749124161" watchObservedRunningTime="2026-02-24 03:12:08.394801506 +0000 UTC m=+1052.411872349"
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.403189 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-84w4c"]
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.582237 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" event={"ID":"def9fcfc-1366-4652-a4c7-aeec946c3d96","Type":"ContainerStarted","Data":"feb6787c7e98f2fbe412f3b5398b72c45d5eb2cc9c230d730805b44e2cd1686f"}
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.583637 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp"
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.585730 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-84w4c" event={"ID":"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c","Type":"ContainerStarted","Data":"ceea821318579834b048f16f403a8bf1a989c8560ec38833f5e1572cad831935"}
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.585803 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.596717 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.613090 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" podStartSLOduration=3.613025171 podStartE2EDuration="3.613025171s" podCreationTimestamp="2026-02-24 03:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:08.612176349 +0000 UTC m=+1052.629247182" watchObservedRunningTime="2026-02-24 03:12:08.613025171 +0000 UTC m=+1052.630096034"
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.718019 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-swiftconf\") pod \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") "
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.718157 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-etc-swift\") pod \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") "
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.718236 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqvdh\" (UniqueName: \"kubernetes.io/projected/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-kube-api-access-xqvdh\") pod \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") "
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.718289 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-combined-ca-bundle\") pod \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") "
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.718438 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-scripts\") pod \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") "
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.718535 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-ring-data-devices\") pod \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") "
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.718587 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-dispersionconf\") pod \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\" (UID: \"c7b2b129-bd9d-4f56-8ac1-796172c3ac8b\") "
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.719022 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b" (UID: "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.719133 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-scripts" (OuterVolumeSpecName: "scripts") pod "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b" (UID: "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.720276 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.720528 4923 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.720907 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b" (UID: "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.723387 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b" (UID: "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.724560 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-kube-api-access-xqvdh" (OuterVolumeSpecName: "kube-api-access-xqvdh") pod "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b" (UID: "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b"). InnerVolumeSpecName "kube-api-access-xqvdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.726073 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b" (UID: "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.726670 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b" (UID: "c7b2b129-bd9d-4f56-8ac1-796172c3ac8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.821738 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0"
Feb 24 03:12:08 crc kubenswrapper[4923]: E0224 03:12:08.821958 4923 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.821980 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqvdh\" (UniqueName: \"kubernetes.io/projected/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-kube-api-access-xqvdh\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.822015 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.822037 4923 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.822055 4923 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:08 crc kubenswrapper[4923]: I0224 03:12:08.822074 4923 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:08 crc kubenswrapper[4923]: E0224 03:12:08.821989 4923 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 24 03:12:08 crc kubenswrapper[4923]: E0224 03:12:08.822159 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift podName:d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f nodeName:}" failed. No retries permitted until 2026-02-24 03:12:10.822138437 +0000 UTC m=+1054.839209270 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift") pod "swift-storage-0" (UID: "d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f") : configmap "swift-ring-files" not found
Feb 24 03:12:09 crc kubenswrapper[4923]: I0224 03:12:09.173913 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 24 03:12:09 crc kubenswrapper[4923]: I0224 03:12:09.273435 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 24 03:12:09 crc kubenswrapper[4923]: I0224 03:12:09.593834 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fcvhg"
Feb 24 03:12:09 crc kubenswrapper[4923]: I0224 03:12:09.638573 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-fcvhg"]
Feb 24 03:12:09 crc kubenswrapper[4923]: I0224 03:12:09.643726 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-fcvhg"]
Feb 24 03:12:09 crc kubenswrapper[4923]: I0224 03:12:09.731900 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba1685d9-4549-4187-a44c-32abe83b6890" path="/var/lib/kubelet/pods/ba1685d9-4549-4187-a44c-32abe83b6890/volumes"
Feb 24 03:12:09 crc kubenswrapper[4923]: I0224 03:12:09.732646 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b2b129-bd9d-4f56-8ac1-796172c3ac8b" path="/var/lib/kubelet/pods/c7b2b129-bd9d-4f56-8ac1-796172c3ac8b/volumes"
Feb 24 03:12:10 crc kubenswrapper[4923]: I0224 03:12:10.868417 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0"
Feb 24 03:12:10 crc kubenswrapper[4923]: E0224 03:12:10.868616 4923 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 24 03:12:10 crc kubenswrapper[4923]: E0224 03:12:10.868659 4923 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 24 03:12:10 crc kubenswrapper[4923]: E0224 03:12:10.868726 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift podName:d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f nodeName:}" failed. No retries permitted until 2026-02-24 03:12:14.868702409 +0000 UTC m=+1058.885773222 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift") pod "swift-storage-0" (UID: "d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f") : configmap "swift-ring-files" not found
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.612521 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-84w4c" event={"ID":"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c","Type":"ContainerStarted","Data":"79fa340056f3bf097bbe3e458e5c5918a0d54340c6038a1a43c419fe5e47dd49"}
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.671875 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.672424 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.770847 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.773216 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-84w4c" podStartSLOduration=1.9039400039999999 podStartE2EDuration="4.773176259s" podCreationTimestamp="2026-02-24 03:12:07 +0000 UTC" firstStartedPulling="2026-02-24 03:12:08.414124653 +0000 UTC m=+1052.431195506" lastFinishedPulling="2026-02-24 03:12:11.283360948 +0000 UTC m=+1055.300431761" observedRunningTime="2026-02-24 03:12:11.631928143 +0000 UTC m=+1055.648998956" watchObservedRunningTime="2026-02-24 03:12:11.773176259 +0000 UTC m=+1055.790247122"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.784476 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-d7ggx"]
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.785959 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d7ggx"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.788058 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.788715 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d7ggx"]
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.885348 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52f7l\" (UniqueName: \"kubernetes.io/projected/80d49324-da62-4a35-be40-046dabff25aa-kube-api-access-52f7l\") pod \"root-account-create-update-d7ggx\" (UID: \"80d49324-da62-4a35-be40-046dabff25aa\") " pod="openstack/root-account-create-update-d7ggx"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.885528 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d49324-da62-4a35-be40-046dabff25aa-operator-scripts\") pod \"root-account-create-update-d7ggx\" (UID: \"80d49324-da62-4a35-be40-046dabff25aa\") " pod="openstack/root-account-create-update-d7ggx"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.987790 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d49324-da62-4a35-be40-046dabff25aa-operator-scripts\") pod \"root-account-create-update-d7ggx\" (UID: \"80d49324-da62-4a35-be40-046dabff25aa\") " pod="openstack/root-account-create-update-d7ggx"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.987908 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52f7l\" (UniqueName: \"kubernetes.io/projected/80d49324-da62-4a35-be40-046dabff25aa-kube-api-access-52f7l\") pod \"root-account-create-update-d7ggx\" (UID: \"80d49324-da62-4a35-be40-046dabff25aa\") " pod="openstack/root-account-create-update-d7ggx"
Feb 24 03:12:11 crc kubenswrapper[4923]: I0224 03:12:11.989376 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d49324-da62-4a35-be40-046dabff25aa-operator-scripts\") pod \"root-account-create-update-d7ggx\" (UID: \"80d49324-da62-4a35-be40-046dabff25aa\") " pod="openstack/root-account-create-update-d7ggx"
Feb 24 03:12:12 crc kubenswrapper[4923]: I0224 03:12:12.007038 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52f7l\" (UniqueName: \"kubernetes.io/projected/80d49324-da62-4a35-be40-046dabff25aa-kube-api-access-52f7l\") pod \"root-account-create-update-d7ggx\" (UID: \"80d49324-da62-4a35-be40-046dabff25aa\") " pod="openstack/root-account-create-update-d7ggx"
Feb 24 03:12:12 crc kubenswrapper[4923]: I0224 03:12:12.111049 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d7ggx"
Feb 24 03:12:12 crc kubenswrapper[4923]: I0224 03:12:12.612757 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-d7ggx"]
Feb 24 03:12:12 crc kubenswrapper[4923]: W0224 03:12:12.627885 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80d49324_da62_4a35_be40_046dabff25aa.slice/crio-a36e17aceb2a2cdbf2f7ad628bd3520074f7bd03594c3df64e3aa9bbdc81244a WatchSource:0}: Error finding container a36e17aceb2a2cdbf2f7ad628bd3520074f7bd03594c3df64e3aa9bbdc81244a: Status 404 returned error can't find the container with id a36e17aceb2a2cdbf2f7ad628bd3520074f7bd03594c3df64e3aa9bbdc81244a
Feb 24 03:12:12 crc kubenswrapper[4923]: I0224 03:12:12.726245 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.560095 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-krv9d"]
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.564290 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.574057 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-krv9d"]
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.640525 4923 generic.go:334] "Generic (PLEG): container finished" podID="80d49324-da62-4a35-be40-046dabff25aa" containerID="1501ce8e6dcd79ed916a4ecd4fff77e72e3339ab4d79052e4f0610f226edeca9" exitCode=0
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.641887 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d7ggx" event={"ID":"80d49324-da62-4a35-be40-046dabff25aa","Type":"ContainerDied","Data":"1501ce8e6dcd79ed916a4ecd4fff77e72e3339ab4d79052e4f0610f226edeca9"}
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.642049 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d7ggx" event={"ID":"80d49324-da62-4a35-be40-046dabff25aa","Type":"ContainerStarted","Data":"a36e17aceb2a2cdbf2f7ad628bd3520074f7bd03594c3df64e3aa9bbdc81244a"}
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.687130 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-79bc-account-create-update-zfc9d"]
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.688180 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.691702 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.698260 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-79bc-account-create-update-zfc9d"]
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.723555 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/627f800c-b94f-43f0-b2be-f7da4d5cb178-operator-scripts\") pod \"glance-db-create-krv9d\" (UID: \"627f800c-b94f-43f0-b2be-f7da4d5cb178\") " pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.723628 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6rps\" (UniqueName: \"kubernetes.io/projected/627f800c-b94f-43f0-b2be-f7da4d5cb178-kube-api-access-r6rps\") pod \"glance-db-create-krv9d\" (UID: \"627f800c-b94f-43f0-b2be-f7da4d5cb178\") " pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.827211 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/627f800c-b94f-43f0-b2be-f7da4d5cb178-operator-scripts\") pod \"glance-db-create-krv9d\" (UID: \"627f800c-b94f-43f0-b2be-f7da4d5cb178\") " pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.827360 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abba0568-43ed-4b23-9437-8da7ed288e99-operator-scripts\") pod \"glance-79bc-account-create-update-zfc9d\" (UID: \"abba0568-43ed-4b23-9437-8da7ed288e99\") " pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.827408 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6rps\" (UniqueName: \"kubernetes.io/projected/627f800c-b94f-43f0-b2be-f7da4d5cb178-kube-api-access-r6rps\") pod \"glance-db-create-krv9d\" (UID: \"627f800c-b94f-43f0-b2be-f7da4d5cb178\") " pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.827484 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4zgz\" (UniqueName: \"kubernetes.io/projected/abba0568-43ed-4b23-9437-8da7ed288e99-kube-api-access-n4zgz\") pod \"glance-79bc-account-create-update-zfc9d\" (UID: \"abba0568-43ed-4b23-9437-8da7ed288e99\") " pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.828517 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/627f800c-b94f-43f0-b2be-f7da4d5cb178-operator-scripts\") pod \"glance-db-create-krv9d\" (UID: \"627f800c-b94f-43f0-b2be-f7da4d5cb178\") " pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.860155 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6rps\" (UniqueName: \"kubernetes.io/projected/627f800c-b94f-43f0-b2be-f7da4d5cb178-kube-api-access-r6rps\") pod \"glance-db-create-krv9d\" (UID: \"627f800c-b94f-43f0-b2be-f7da4d5cb178\") " pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.887818 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.932331 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4zgz\" (UniqueName: \"kubernetes.io/projected/abba0568-43ed-4b23-9437-8da7ed288e99-kube-api-access-n4zgz\") pod \"glance-79bc-account-create-update-zfc9d\" (UID: \"abba0568-43ed-4b23-9437-8da7ed288e99\") " pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.932573 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abba0568-43ed-4b23-9437-8da7ed288e99-operator-scripts\") pod \"glance-79bc-account-create-update-zfc9d\" (UID: \"abba0568-43ed-4b23-9437-8da7ed288e99\") " pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.933461 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abba0568-43ed-4b23-9437-8da7ed288e99-operator-scripts\") pod \"glance-79bc-account-create-update-zfc9d\" (UID: \"abba0568-43ed-4b23-9437-8da7ed288e99\") " pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:13 crc kubenswrapper[4923]: I0224 03:12:13.968780 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4zgz\" (UniqueName: \"kubernetes.io/projected/abba0568-43ed-4b23-9437-8da7ed288e99-kube-api-access-n4zgz\") pod \"glance-79bc-account-create-update-zfc9d\" (UID: \"abba0568-43ed-4b23-9437-8da7ed288e99\") " pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.002145 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.335133 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-dtx22"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.335176 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2t5qn"]
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.336269 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2t5qn"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.354348 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-79bc-account-create-update-zfc9d"]
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.368171 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2t5qn"]
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.404978 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-krv9d"]
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.429253 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-351c-account-create-update-knt7d"]
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.430181 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-351c-account-create-update-knt7d"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.435936 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.448173 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f714270c-0420-4d1f-92fc-24afd3587dfc-operator-scripts\") pod \"keystone-db-create-2t5qn\" (UID: \"f714270c-0420-4d1f-92fc-24afd3587dfc\") " pod="openstack/keystone-db-create-2t5qn"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.448367 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db6mm\" (UniqueName: \"kubernetes.io/projected/f714270c-0420-4d1f-92fc-24afd3587dfc-kube-api-access-db6mm\") pod \"keystone-db-create-2t5qn\" (UID: \"f714270c-0420-4d1f-92fc-24afd3587dfc\") " pod="openstack/keystone-db-create-2t5qn"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.461898 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-351c-account-create-update-knt7d"]
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.509573 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xlmjx"]
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.510529 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xlmjx"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.518664 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xlmjx"]
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.549634 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57faa09-2679-4d8f-92f5-45a2dccda444-operator-scripts\") pod \"keystone-351c-account-create-update-knt7d\" (UID: \"d57faa09-2679-4d8f-92f5-45a2dccda444\") " pod="openstack/keystone-351c-account-create-update-knt7d"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.549705 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f714270c-0420-4d1f-92fc-24afd3587dfc-operator-scripts\") pod \"keystone-db-create-2t5qn\" (UID: \"f714270c-0420-4d1f-92fc-24afd3587dfc\") " pod="openstack/keystone-db-create-2t5qn"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.549815 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxz2\" (UniqueName: \"kubernetes.io/projected/d57faa09-2679-4d8f-92f5-45a2dccda444-kube-api-access-nnxz2\") pod \"keystone-351c-account-create-update-knt7d\" (UID: \"d57faa09-2679-4d8f-92f5-45a2dccda444\") " pod="openstack/keystone-351c-account-create-update-knt7d"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.549841 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db6mm\" (UniqueName: \"kubernetes.io/projected/f714270c-0420-4d1f-92fc-24afd3587dfc-kube-api-access-db6mm\") pod \"keystone-db-create-2t5qn\" (UID: \"f714270c-0420-4d1f-92fc-24afd3587dfc\") " pod="openstack/keystone-db-create-2t5qn"
Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.566006 4923 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f714270c-0420-4d1f-92fc-24afd3587dfc-operator-scripts\") pod \"keystone-db-create-2t5qn\" (UID: \"f714270c-0420-4d1f-92fc-24afd3587dfc\") " pod="openstack/keystone-db-create-2t5qn" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.584191 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db6mm\" (UniqueName: \"kubernetes.io/projected/f714270c-0420-4d1f-92fc-24afd3587dfc-kube-api-access-db6mm\") pod \"keystone-db-create-2t5qn\" (UID: \"f714270c-0420-4d1f-92fc-24afd3587dfc\") " pod="openstack/keystone-db-create-2t5qn" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.618172 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5bfc-account-create-update-z6s8t"] Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.620466 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bfc-account-create-update-z6s8t" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.622449 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.623782 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bfc-account-create-update-z6s8t"] Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.647699 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-krv9d" event={"ID":"627f800c-b94f-43f0-b2be-f7da4d5cb178","Type":"ContainerStarted","Data":"d48a3ceb5f9b48d55905d1d1f01a9fd1ffedd09b0a6ab3af9813f7695a32207b"} Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.647960 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-krv9d" 
event={"ID":"627f800c-b94f-43f0-b2be-f7da4d5cb178","Type":"ContainerStarted","Data":"ad91cf5691e71bec03b50226f1785aade0838a45bd842a5a3522d170c62ce6cb"} Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.655368 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79bc-account-create-update-zfc9d" event={"ID":"abba0568-43ed-4b23-9437-8da7ed288e99","Type":"ContainerStarted","Data":"1efc4bb526339629c885d0b306bf264bcaeb482c4d4dbf127c9aa3929a278019"} Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.655411 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79bc-account-create-update-zfc9d" event={"ID":"abba0568-43ed-4b23-9437-8da7ed288e99","Type":"ContainerStarted","Data":"8277e617cf3a7e398f0712f6500c2c5064431f0860ce35b8ee0ad99617d64dce"} Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.663798 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57faa09-2679-4d8f-92f5-45a2dccda444-operator-scripts\") pod \"keystone-351c-account-create-update-knt7d\" (UID: \"d57faa09-2679-4d8f-92f5-45a2dccda444\") " pod="openstack/keystone-351c-account-create-update-knt7d" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.663902 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-operator-scripts\") pod \"placement-db-create-xlmjx\" (UID: \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\") " pod="openstack/placement-db-create-xlmjx" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.664022 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnxz2\" (UniqueName: \"kubernetes.io/projected/d57faa09-2679-4d8f-92f5-45a2dccda444-kube-api-access-nnxz2\") pod \"keystone-351c-account-create-update-knt7d\" (UID: \"d57faa09-2679-4d8f-92f5-45a2dccda444\") " 
pod="openstack/keystone-351c-account-create-update-knt7d" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.664051 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j9mz\" (UniqueName: \"kubernetes.io/projected/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-kube-api-access-2j9mz\") pod \"placement-db-create-xlmjx\" (UID: \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\") " pod="openstack/placement-db-create-xlmjx" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.664992 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57faa09-2679-4d8f-92f5-45a2dccda444-operator-scripts\") pod \"keystone-351c-account-create-update-knt7d\" (UID: \"d57faa09-2679-4d8f-92f5-45a2dccda444\") " pod="openstack/keystone-351c-account-create-update-knt7d" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.665006 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-krv9d" podStartSLOduration=1.664989356 podStartE2EDuration="1.664989356s" podCreationTimestamp="2026-02-24 03:12:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:14.662901152 +0000 UTC m=+1058.679971965" watchObservedRunningTime="2026-02-24 03:12:14.664989356 +0000 UTC m=+1058.682060169" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.673992 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2t5qn" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.693973 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnxz2\" (UniqueName: \"kubernetes.io/projected/d57faa09-2679-4d8f-92f5-45a2dccda444-kube-api-access-nnxz2\") pod \"keystone-351c-account-create-update-knt7d\" (UID: \"d57faa09-2679-4d8f-92f5-45a2dccda444\") " pod="openstack/keystone-351c-account-create-update-knt7d" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.765355 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-operator-scripts\") pod \"placement-db-create-xlmjx\" (UID: \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\") " pod="openstack/placement-db-create-xlmjx" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.765619 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtfz\" (UniqueName: \"kubernetes.io/projected/d9ab4036-b523-47e5-ac77-f346b3f4e60f-kube-api-access-sdtfz\") pod \"placement-5bfc-account-create-update-z6s8t\" (UID: \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\") " pod="openstack/placement-5bfc-account-create-update-z6s8t" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.765913 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-operator-scripts\") pod \"placement-db-create-xlmjx\" (UID: \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\") " pod="openstack/placement-db-create-xlmjx" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.766016 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j9mz\" (UniqueName: \"kubernetes.io/projected/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-kube-api-access-2j9mz\") pod 
\"placement-db-create-xlmjx\" (UID: \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\") " pod="openstack/placement-db-create-xlmjx" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.766096 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9ab4036-b523-47e5-ac77-f346b3f4e60f-operator-scripts\") pod \"placement-5bfc-account-create-update-z6s8t\" (UID: \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\") " pod="openstack/placement-5bfc-account-create-update-z6s8t" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.772178 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-351c-account-create-update-knt7d" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.784260 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j9mz\" (UniqueName: \"kubernetes.io/projected/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-kube-api-access-2j9mz\") pod \"placement-db-create-xlmjx\" (UID: \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\") " pod="openstack/placement-db-create-xlmjx" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.835672 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xlmjx" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.870148 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtfz\" (UniqueName: \"kubernetes.io/projected/d9ab4036-b523-47e5-ac77-f346b3f4e60f-kube-api-access-sdtfz\") pod \"placement-5bfc-account-create-update-z6s8t\" (UID: \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\") " pod="openstack/placement-5bfc-account-create-update-z6s8t" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.870200 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.870278 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9ab4036-b523-47e5-ac77-f346b3f4e60f-operator-scripts\") pod \"placement-5bfc-account-create-update-z6s8t\" (UID: \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\") " pod="openstack/placement-5bfc-account-create-update-z6s8t" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.871373 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9ab4036-b523-47e5-ac77-f346b3f4e60f-operator-scripts\") pod \"placement-5bfc-account-create-update-z6s8t\" (UID: \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\") " pod="openstack/placement-5bfc-account-create-update-z6s8t" Feb 24 03:12:14 crc kubenswrapper[4923]: E0224 03:12:14.871467 4923 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 24 03:12:14 crc kubenswrapper[4923]: E0224 03:12:14.871480 4923 projected.go:194] Error preparing data for projected volume etc-swift 
for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 24 03:12:14 crc kubenswrapper[4923]: E0224 03:12:14.871516 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift podName:d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f nodeName:}" failed. No retries permitted until 2026-02-24 03:12:22.871503135 +0000 UTC m=+1066.888573948 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift") pod "swift-storage-0" (UID: "d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f") : configmap "swift-ring-files" not found Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.909959 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdtfz\" (UniqueName: \"kubernetes.io/projected/d9ab4036-b523-47e5-ac77-f346b3f4e60f-kube-api-access-sdtfz\") pod \"placement-5bfc-account-create-update-z6s8t\" (UID: \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\") " pod="openstack/placement-5bfc-account-create-update-z6s8t" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.934502 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5bfc-account-create-update-z6s8t" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.975532 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-79bc-account-create-update-zfc9d" podStartSLOduration=1.9755134330000002 podStartE2EDuration="1.975513433s" podCreationTimestamp="2026-02-24 03:12:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:14.689746846 +0000 UTC m=+1058.706817659" watchObservedRunningTime="2026-02-24 03:12:14.975513433 +0000 UTC m=+1058.992584246" Feb 24 03:12:14 crc kubenswrapper[4923]: I0224 03:12:14.984326 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2t5qn"] Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.141396 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d7ggx" Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.205223 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bfc-account-create-update-z6s8t"] Feb 24 03:12:15 crc kubenswrapper[4923]: W0224 03:12:15.226692 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9ab4036_b523_47e5_ac77_f346b3f4e60f.slice/crio-fc3874cbed78de1e09a69729d612e22634a2c2666a3ef04cf925957da508b8f9 WatchSource:0}: Error finding container fc3874cbed78de1e09a69729d612e22634a2c2666a3ef04cf925957da508b8f9: Status 404 returned error can't find the container with id fc3874cbed78de1e09a69729d612e22634a2c2666a3ef04cf925957da508b8f9 Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.238768 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-351c-account-create-update-knt7d"] Feb 24 03:12:15 crc kubenswrapper[4923]: W0224 03:12:15.247734 4923 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd57faa09_2679_4d8f_92f5_45a2dccda444.slice/crio-f310095a4daea7432e4b3582ddd3a465947f87689a347b07c40e7df8d2d8f9b5 WatchSource:0}: Error finding container f310095a4daea7432e4b3582ddd3a465947f87689a347b07c40e7df8d2d8f9b5: Status 404 returned error can't find the container with id f310095a4daea7432e4b3582ddd3a465947f87689a347b07c40e7df8d2d8f9b5 Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.275615 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52f7l\" (UniqueName: \"kubernetes.io/projected/80d49324-da62-4a35-be40-046dabff25aa-kube-api-access-52f7l\") pod \"80d49324-da62-4a35-be40-046dabff25aa\" (UID: \"80d49324-da62-4a35-be40-046dabff25aa\") " Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.275851 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d49324-da62-4a35-be40-046dabff25aa-operator-scripts\") pod \"80d49324-da62-4a35-be40-046dabff25aa\" (UID: \"80d49324-da62-4a35-be40-046dabff25aa\") " Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.276582 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d49324-da62-4a35-be40-046dabff25aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80d49324-da62-4a35-be40-046dabff25aa" (UID: "80d49324-da62-4a35-be40-046dabff25aa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.279802 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d49324-da62-4a35-be40-046dabff25aa-kube-api-access-52f7l" (OuterVolumeSpecName: "kube-api-access-52f7l") pod "80d49324-da62-4a35-be40-046dabff25aa" (UID: "80d49324-da62-4a35-be40-046dabff25aa"). InnerVolumeSpecName "kube-api-access-52f7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.301272 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xlmjx"] Feb 24 03:12:15 crc kubenswrapper[4923]: W0224 03:12:15.319260 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a78b2aa_1ea3_4b48_bc85_128afc2bfa06.slice/crio-f4db823ecf8467f21f8bb5a7898dee595c414eeebea9ba62075e88c7aeff4a39 WatchSource:0}: Error finding container f4db823ecf8467f21f8bb5a7898dee595c414eeebea9ba62075e88c7aeff4a39: Status 404 returned error can't find the container with id f4db823ecf8467f21f8bb5a7898dee595c414eeebea9ba62075e88c7aeff4a39 Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.378187 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d49324-da62-4a35-be40-046dabff25aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.378221 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52f7l\" (UniqueName: \"kubernetes.io/projected/80d49324-da62-4a35-be40-046dabff25aa-kube-api-access-52f7l\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.671812 4923 generic.go:334] "Generic (PLEG): container finished" podID="f714270c-0420-4d1f-92fc-24afd3587dfc" 
containerID="82c4f785863b9a50a42bd9130a45b9860f893cb71fd5de77ec4a3ed236614634" exitCode=0 Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.672202 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2t5qn" event={"ID":"f714270c-0420-4d1f-92fc-24afd3587dfc","Type":"ContainerDied","Data":"82c4f785863b9a50a42bd9130a45b9860f893cb71fd5de77ec4a3ed236614634"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.672233 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2t5qn" event={"ID":"f714270c-0420-4d1f-92fc-24afd3587dfc","Type":"ContainerStarted","Data":"d4c3fe64b02fbb7cefd7b1d4823007c1d1de1f587c0e47ba97346396411c23b3"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.674501 4923 generic.go:334] "Generic (PLEG): container finished" podID="3a78b2aa-1ea3-4b48-bc85-128afc2bfa06" containerID="2ba02616a977174b628241c112670c43c58695cf30f3594f8357acff57882917" exitCode=0 Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.674542 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlmjx" event={"ID":"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06","Type":"ContainerDied","Data":"2ba02616a977174b628241c112670c43c58695cf30f3594f8357acff57882917"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.674556 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlmjx" event={"ID":"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06","Type":"ContainerStarted","Data":"f4db823ecf8467f21f8bb5a7898dee595c414eeebea9ba62075e88c7aeff4a39"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.676287 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-d7ggx" event={"ID":"80d49324-da62-4a35-be40-046dabff25aa","Type":"ContainerDied","Data":"a36e17aceb2a2cdbf2f7ad628bd3520074f7bd03594c3df64e3aa9bbdc81244a"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.676324 4923 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="a36e17aceb2a2cdbf2f7ad628bd3520074f7bd03594c3df64e3aa9bbdc81244a" Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.676687 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-d7ggx" Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.678049 4923 generic.go:334] "Generic (PLEG): container finished" podID="627f800c-b94f-43f0-b2be-f7da4d5cb178" containerID="d48a3ceb5f9b48d55905d1d1f01a9fd1ffedd09b0a6ab3af9813f7695a32207b" exitCode=0 Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.678174 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-krv9d" event={"ID":"627f800c-b94f-43f0-b2be-f7da4d5cb178","Type":"ContainerDied","Data":"d48a3ceb5f9b48d55905d1d1f01a9fd1ffedd09b0a6ab3af9813f7695a32207b"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.679656 4923 generic.go:334] "Generic (PLEG): container finished" podID="abba0568-43ed-4b23-9437-8da7ed288e99" containerID="1efc4bb526339629c885d0b306bf264bcaeb482c4d4dbf127c9aa3929a278019" exitCode=0 Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.679704 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79bc-account-create-update-zfc9d" event={"ID":"abba0568-43ed-4b23-9437-8da7ed288e99","Type":"ContainerDied","Data":"1efc4bb526339629c885d0b306bf264bcaeb482c4d4dbf127c9aa3929a278019"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.687551 4923 generic.go:334] "Generic (PLEG): container finished" podID="d57faa09-2679-4d8f-92f5-45a2dccda444" containerID="5e987cbbecc17dbadf8c01f3cdf877275c9210e566fb1e10242836ef9c2f4ad6" exitCode=0 Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.687656 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-351c-account-create-update-knt7d" 
event={"ID":"d57faa09-2679-4d8f-92f5-45a2dccda444","Type":"ContainerDied","Data":"5e987cbbecc17dbadf8c01f3cdf877275c9210e566fb1e10242836ef9c2f4ad6"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.687692 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-351c-account-create-update-knt7d" event={"ID":"d57faa09-2679-4d8f-92f5-45a2dccda444","Type":"ContainerStarted","Data":"f310095a4daea7432e4b3582ddd3a465947f87689a347b07c40e7df8d2d8f9b5"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.690667 4923 generic.go:334] "Generic (PLEG): container finished" podID="d9ab4036-b523-47e5-ac77-f346b3f4e60f" containerID="11c7a6d9684f5e25454deb86433fd0358abeb688f4a7bb99c8408450a620758a" exitCode=0 Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.690725 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bfc-account-create-update-z6s8t" event={"ID":"d9ab4036-b523-47e5-ac77-f346b3f4e60f","Type":"ContainerDied","Data":"11c7a6d9684f5e25454deb86433fd0358abeb688f4a7bb99c8408450a620758a"} Feb 24 03:12:15 crc kubenswrapper[4923]: I0224 03:12:15.690754 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bfc-account-create-update-z6s8t" event={"ID":"d9ab4036-b523-47e5-ac77-f346b3f4e60f","Type":"ContainerStarted","Data":"fc3874cbed78de1e09a69729d612e22634a2c2666a3ef04cf925957da508b8f9"} Feb 24 03:12:16 crc kubenswrapper[4923]: I0224 03:12:16.297681 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:16 crc kubenswrapper[4923]: I0224 03:12:16.366680 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dtx22"] Feb 24 03:12:16 crc kubenswrapper[4923]: I0224 03:12:16.367053 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-dtx22" podUID="3728aab2-9679-48c9-8efa-fec1eba7bb89" containerName="dnsmasq-dns" 
containerID="cri-o://d8acbd6c41d8dc2e05d101ba696c2819a26de02861b170a1a78d9a8475d4a3e7" gracePeriod=10 Feb 24 03:12:16 crc kubenswrapper[4923]: I0224 03:12:16.703441 4923 generic.go:334] "Generic (PLEG): container finished" podID="3728aab2-9679-48c9-8efa-fec1eba7bb89" containerID="d8acbd6c41d8dc2e05d101ba696c2819a26de02861b170a1a78d9a8475d4a3e7" exitCode=0 Feb 24 03:12:16 crc kubenswrapper[4923]: I0224 03:12:16.704266 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dtx22" event={"ID":"3728aab2-9679-48c9-8efa-fec1eba7bb89","Type":"ContainerDied","Data":"d8acbd6c41d8dc2e05d101ba696c2819a26de02861b170a1a78d9a8475d4a3e7"} Feb 24 03:12:16 crc kubenswrapper[4923]: I0224 03:12:16.903694 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dtx22" Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.022570 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-nb\") pod \"3728aab2-9679-48c9-8efa-fec1eba7bb89\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.022730 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-dns-svc\") pod \"3728aab2-9679-48c9-8efa-fec1eba7bb89\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.022762 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-config\") pod \"3728aab2-9679-48c9-8efa-fec1eba7bb89\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") " Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.023578 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-c8cpg\" (UniqueName: \"kubernetes.io/projected/3728aab2-9679-48c9-8efa-fec1eba7bb89-kube-api-access-c8cpg\") pod \"3728aab2-9679-48c9-8efa-fec1eba7bb89\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.023625 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-sb\") pod \"3728aab2-9679-48c9-8efa-fec1eba7bb89\" (UID: \"3728aab2-9679-48c9-8efa-fec1eba7bb89\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.032875 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3728aab2-9679-48c9-8efa-fec1eba7bb89-kube-api-access-c8cpg" (OuterVolumeSpecName: "kube-api-access-c8cpg") pod "3728aab2-9679-48c9-8efa-fec1eba7bb89" (UID: "3728aab2-9679-48c9-8efa-fec1eba7bb89"). InnerVolumeSpecName "kube-api-access-c8cpg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.079456 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3728aab2-9679-48c9-8efa-fec1eba7bb89" (UID: "3728aab2-9679-48c9-8efa-fec1eba7bb89"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.080494 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-config" (OuterVolumeSpecName: "config") pod "3728aab2-9679-48c9-8efa-fec1eba7bb89" (UID: "3728aab2-9679-48c9-8efa-fec1eba7bb89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.080995 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3728aab2-9679-48c9-8efa-fec1eba7bb89" (UID: "3728aab2-9679-48c9-8efa-fec1eba7bb89"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.093669 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3728aab2-9679-48c9-8efa-fec1eba7bb89" (UID: "3728aab2-9679-48c9-8efa-fec1eba7bb89"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.125385 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-config\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.125420 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8cpg\" (UniqueName: \"kubernetes.io/projected/3728aab2-9679-48c9-8efa-fec1eba7bb89-kube-api-access-c8cpg\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.125432 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.125440 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.125448 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3728aab2-9679-48c9-8efa-fec1eba7bb89-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.156878 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bfc-account-create-update-z6s8t"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.226445 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdtfz\" (UniqueName: \"kubernetes.io/projected/d9ab4036-b523-47e5-ac77-f346b3f4e60f-kube-api-access-sdtfz\") pod \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\" (UID: \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.226583 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9ab4036-b523-47e5-ac77-f346b3f4e60f-operator-scripts\") pod \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\" (UID: \"d9ab4036-b523-47e5-ac77-f346b3f4e60f\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.229651 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9ab4036-b523-47e5-ac77-f346b3f4e60f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9ab4036-b523-47e5-ac77-f346b3f4e60f" (UID: "d9ab4036-b523-47e5-ac77-f346b3f4e60f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.234096 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ab4036-b523-47e5-ac77-f346b3f4e60f-kube-api-access-sdtfz" (OuterVolumeSpecName: "kube-api-access-sdtfz") pod "d9ab4036-b523-47e5-ac77-f346b3f4e60f" (UID: "d9ab4036-b523-47e5-ac77-f346b3f4e60f"). InnerVolumeSpecName "kube-api-access-sdtfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.265706 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-351c-account-create-update-knt7d"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.272462 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xlmjx"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.280896 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.295154 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2t5qn"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.311348 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.328282 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdtfz\" (UniqueName: \"kubernetes.io/projected/d9ab4036-b523-47e5-ac77-f346b3f4e60f-kube-api-access-sdtfz\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.328879 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9ab4036-b523-47e5-ac77-f346b3f4e60f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.430043 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/627f800c-b94f-43f0-b2be-f7da4d5cb178-operator-scripts\") pod \"627f800c-b94f-43f0-b2be-f7da4d5cb178\" (UID: \"627f800c-b94f-43f0-b2be-f7da4d5cb178\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.430329 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4zgz\" (UniqueName: \"kubernetes.io/projected/abba0568-43ed-4b23-9437-8da7ed288e99-kube-api-access-n4zgz\") pod \"abba0568-43ed-4b23-9437-8da7ed288e99\" (UID: \"abba0568-43ed-4b23-9437-8da7ed288e99\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.430497 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57faa09-2679-4d8f-92f5-45a2dccda444-operator-scripts\") pod \"d57faa09-2679-4d8f-92f5-45a2dccda444\" (UID: \"d57faa09-2679-4d8f-92f5-45a2dccda444\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.430713 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abba0568-43ed-4b23-9437-8da7ed288e99-operator-scripts\") pod \"abba0568-43ed-4b23-9437-8da7ed288e99\" (UID: \"abba0568-43ed-4b23-9437-8da7ed288e99\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.430855 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f714270c-0420-4d1f-92fc-24afd3587dfc-operator-scripts\") pod \"f714270c-0420-4d1f-92fc-24afd3587dfc\" (UID: \"f714270c-0420-4d1f-92fc-24afd3587dfc\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.430954 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnxz2\" (UniqueName: \"kubernetes.io/projected/d57faa09-2679-4d8f-92f5-45a2dccda444-kube-api-access-nnxz2\") pod \"d57faa09-2679-4d8f-92f5-45a2dccda444\" (UID: \"d57faa09-2679-4d8f-92f5-45a2dccda444\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.431063 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6rps\" (UniqueName: \"kubernetes.io/projected/627f800c-b94f-43f0-b2be-f7da4d5cb178-kube-api-access-r6rps\") pod \"627f800c-b94f-43f0-b2be-f7da4d5cb178\" (UID: \"627f800c-b94f-43f0-b2be-f7da4d5cb178\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.431190 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-operator-scripts\") pod \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\" (UID: \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.431371 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db6mm\" (UniqueName: \"kubernetes.io/projected/f714270c-0420-4d1f-92fc-24afd3587dfc-kube-api-access-db6mm\") pod \"f714270c-0420-4d1f-92fc-24afd3587dfc\" (UID: \"f714270c-0420-4d1f-92fc-24afd3587dfc\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.431525 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j9mz\" (UniqueName: \"kubernetes.io/projected/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-kube-api-access-2j9mz\") pod \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\" (UID: \"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06\") "
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.433455 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/627f800c-b94f-43f0-b2be-f7da4d5cb178-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "627f800c-b94f-43f0-b2be-f7da4d5cb178" (UID: "627f800c-b94f-43f0-b2be-f7da4d5cb178"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.433571 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abba0568-43ed-4b23-9437-8da7ed288e99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "abba0568-43ed-4b23-9437-8da7ed288e99" (UID: "abba0568-43ed-4b23-9437-8da7ed288e99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.433971 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57faa09-2679-4d8f-92f5-45a2dccda444-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d57faa09-2679-4d8f-92f5-45a2dccda444" (UID: "d57faa09-2679-4d8f-92f5-45a2dccda444"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.435333 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a78b2aa-1ea3-4b48-bc85-128afc2bfa06" (UID: "3a78b2aa-1ea3-4b48-bc85-128afc2bfa06"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.435757 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f714270c-0420-4d1f-92fc-24afd3587dfc-kube-api-access-db6mm" (OuterVolumeSpecName: "kube-api-access-db6mm") pod "f714270c-0420-4d1f-92fc-24afd3587dfc" (UID: "f714270c-0420-4d1f-92fc-24afd3587dfc"). InnerVolumeSpecName "kube-api-access-db6mm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.436116 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abba0568-43ed-4b23-9437-8da7ed288e99-kube-api-access-n4zgz" (OuterVolumeSpecName: "kube-api-access-n4zgz") pod "abba0568-43ed-4b23-9437-8da7ed288e99" (UID: "abba0568-43ed-4b23-9437-8da7ed288e99"). InnerVolumeSpecName "kube-api-access-n4zgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.436273 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-kube-api-access-2j9mz" (OuterVolumeSpecName: "kube-api-access-2j9mz") pod "3a78b2aa-1ea3-4b48-bc85-128afc2bfa06" (UID: "3a78b2aa-1ea3-4b48-bc85-128afc2bfa06"). InnerVolumeSpecName "kube-api-access-2j9mz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.436664 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f714270c-0420-4d1f-92fc-24afd3587dfc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f714270c-0420-4d1f-92fc-24afd3587dfc" (UID: "f714270c-0420-4d1f-92fc-24afd3587dfc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.437140 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57faa09-2679-4d8f-92f5-45a2dccda444-kube-api-access-nnxz2" (OuterVolumeSpecName: "kube-api-access-nnxz2") pod "d57faa09-2679-4d8f-92f5-45a2dccda444" (UID: "d57faa09-2679-4d8f-92f5-45a2dccda444"). InnerVolumeSpecName "kube-api-access-nnxz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.437215 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627f800c-b94f-43f0-b2be-f7da4d5cb178-kube-api-access-r6rps" (OuterVolumeSpecName: "kube-api-access-r6rps") pod "627f800c-b94f-43f0-b2be-f7da4d5cb178" (UID: "627f800c-b94f-43f0-b2be-f7da4d5cb178"). InnerVolumeSpecName "kube-api-access-r6rps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533099 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/627f800c-b94f-43f0-b2be-f7da4d5cb178-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533126 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4zgz\" (UniqueName: \"kubernetes.io/projected/abba0568-43ed-4b23-9437-8da7ed288e99-kube-api-access-n4zgz\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533138 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57faa09-2679-4d8f-92f5-45a2dccda444-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533147 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abba0568-43ed-4b23-9437-8da7ed288e99-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533159 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f714270c-0420-4d1f-92fc-24afd3587dfc-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533168 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnxz2\" (UniqueName: \"kubernetes.io/projected/d57faa09-2679-4d8f-92f5-45a2dccda444-kube-api-access-nnxz2\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533177 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6rps\" (UniqueName: \"kubernetes.io/projected/627f800c-b94f-43f0-b2be-f7da4d5cb178-kube-api-access-r6rps\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533186 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533194 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db6mm\" (UniqueName: \"kubernetes.io/projected/f714270c-0420-4d1f-92fc-24afd3587dfc-kube-api-access-db6mm\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.533202 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j9mz\" (UniqueName: \"kubernetes.io/projected/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06-kube-api-access-2j9mz\") on node \"crc\" DevicePath \"\""
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.716014 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xlmjx"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.718568 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-dtx22"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.726639 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-krv9d"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.731319 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xlmjx" event={"ID":"3a78b2aa-1ea3-4b48-bc85-128afc2bfa06","Type":"ContainerDied","Data":"f4db823ecf8467f21f8bb5a7898dee595c414eeebea9ba62075e88c7aeff4a39"}
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.731375 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4db823ecf8467f21f8bb5a7898dee595c414eeebea9ba62075e88c7aeff4a39"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.731392 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-dtx22" event={"ID":"3728aab2-9679-48c9-8efa-fec1eba7bb89","Type":"ContainerDied","Data":"de788a7e95256cde5cc76a5cdb29faa471f99bcc88fc6c2f19da178c43a72e2d"}
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.731414 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-krv9d" event={"ID":"627f800c-b94f-43f0-b2be-f7da4d5cb178","Type":"ContainerDied","Data":"ad91cf5691e71bec03b50226f1785aade0838a45bd842a5a3522d170c62ce6cb"}
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.731430 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad91cf5691e71bec03b50226f1785aade0838a45bd842a5a3522d170c62ce6cb"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.731451 4923 scope.go:117] "RemoveContainer" containerID="d8acbd6c41d8dc2e05d101ba696c2819a26de02861b170a1a78d9a8475d4a3e7"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.732233 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79bc-account-create-update-zfc9d"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.732261 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79bc-account-create-update-zfc9d" event={"ID":"abba0568-43ed-4b23-9437-8da7ed288e99","Type":"ContainerDied","Data":"8277e617cf3a7e398f0712f6500c2c5064431f0860ce35b8ee0ad99617d64dce"}
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.732319 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8277e617cf3a7e398f0712f6500c2c5064431f0860ce35b8ee0ad99617d64dce"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.737441 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-351c-account-create-update-knt7d" event={"ID":"d57faa09-2679-4d8f-92f5-45a2dccda444","Type":"ContainerDied","Data":"f310095a4daea7432e4b3582ddd3a465947f87689a347b07c40e7df8d2d8f9b5"}
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.737483 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f310095a4daea7432e4b3582ddd3a465947f87689a347b07c40e7df8d2d8f9b5"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.737461 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-351c-account-create-update-knt7d"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.739646 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bfc-account-create-update-z6s8t" event={"ID":"d9ab4036-b523-47e5-ac77-f346b3f4e60f","Type":"ContainerDied","Data":"fc3874cbed78de1e09a69729d612e22634a2c2666a3ef04cf925957da508b8f9"}
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.739728 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc3874cbed78de1e09a69729d612e22634a2c2666a3ef04cf925957da508b8f9"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.739820 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bfc-account-create-update-z6s8t"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.742579 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2t5qn"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.742590 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2t5qn" event={"ID":"f714270c-0420-4d1f-92fc-24afd3587dfc","Type":"ContainerDied","Data":"d4c3fe64b02fbb7cefd7b1d4823007c1d1de1f587c0e47ba97346396411c23b3"}
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.742680 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4c3fe64b02fbb7cefd7b1d4823007c1d1de1f587c0e47ba97346396411c23b3"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.778198 4923 scope.go:117] "RemoveContainer" containerID="8920583344df99cb575a7f3bd3851daa40f98f2460ab6fd5d8af8550d189c373"
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.827697 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dtx22"]
Feb 24 03:12:17 crc kubenswrapper[4923]: I0224 03:12:17.837398 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-dtx22"]
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.749790 4923 generic.go:334] "Generic (PLEG): container finished" podID="6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" containerID="79fa340056f3bf097bbe3e458e5c5918a0d54340c6038a1a43c419fe5e47dd49" exitCode=0
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.749882 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-84w4c" event={"ID":"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c","Type":"ContainerDied","Data":"79fa340056f3bf097bbe3e458e5c5918a0d54340c6038a1a43c419fe5e47dd49"}
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.799579 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-s6tmr"]
Feb 24 03:12:18 crc kubenswrapper[4923]: E0224 03:12:18.799887 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abba0568-43ed-4b23-9437-8da7ed288e99" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.799906 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="abba0568-43ed-4b23-9437-8da7ed288e99" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: E0224 03:12:18.799923 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ab4036-b523-47e5-ac77-f346b3f4e60f" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.799931 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ab4036-b523-47e5-ac77-f346b3f4e60f" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: E0224 03:12:18.799943 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f714270c-0420-4d1f-92fc-24afd3587dfc" containerName="mariadb-database-create"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.799950 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f714270c-0420-4d1f-92fc-24afd3587dfc" containerName="mariadb-database-create"
Feb 24 03:12:18 crc kubenswrapper[4923]: E0224 03:12:18.799966 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627f800c-b94f-43f0-b2be-f7da4d5cb178" containerName="mariadb-database-create"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.799971 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="627f800c-b94f-43f0-b2be-f7da4d5cb178" containerName="mariadb-database-create"
Feb 24 03:12:18 crc kubenswrapper[4923]: E0224 03:12:18.799985 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3728aab2-9679-48c9-8efa-fec1eba7bb89" containerName="dnsmasq-dns"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.799991 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="3728aab2-9679-48c9-8efa-fec1eba7bb89" containerName="dnsmasq-dns"
Feb 24 03:12:18 crc kubenswrapper[4923]: E0224 03:12:18.800004 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a78b2aa-1ea3-4b48-bc85-128afc2bfa06" containerName="mariadb-database-create"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800011 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a78b2aa-1ea3-4b48-bc85-128afc2bfa06" containerName="mariadb-database-create"
Feb 24 03:12:18 crc kubenswrapper[4923]: E0224 03:12:18.800019 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3728aab2-9679-48c9-8efa-fec1eba7bb89" containerName="init"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800025 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="3728aab2-9679-48c9-8efa-fec1eba7bb89" containerName="init"
Feb 24 03:12:18 crc kubenswrapper[4923]: E0224 03:12:18.800035 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d49324-da62-4a35-be40-046dabff25aa" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800041 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d49324-da62-4a35-be40-046dabff25aa" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: E0224 03:12:18.800053 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57faa09-2679-4d8f-92f5-45a2dccda444" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800059 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57faa09-2679-4d8f-92f5-45a2dccda444" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800200 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d49324-da62-4a35-be40-046dabff25aa" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800212 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="3728aab2-9679-48c9-8efa-fec1eba7bb89" containerName="dnsmasq-dns"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800225 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="abba0568-43ed-4b23-9437-8da7ed288e99" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800233 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57faa09-2679-4d8f-92f5-45a2dccda444" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800242 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f714270c-0420-4d1f-92fc-24afd3587dfc" containerName="mariadb-database-create"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800248 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ab4036-b523-47e5-ac77-f346b3f4e60f" containerName="mariadb-account-create-update"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800257 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="627f800c-b94f-43f0-b2be-f7da4d5cb178" containerName="mariadb-database-create"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800265 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a78b2aa-1ea3-4b48-bc85-128afc2bfa06" containerName="mariadb-database-create"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.800748 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.803821 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-p2qxr"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.804120 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.818036 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s6tmr"]
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.955490 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-config-data\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.955550 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-db-sync-config-data\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.955577 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-combined-ca-bundle\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:18 crc kubenswrapper[4923]: I0224 03:12:18.955595 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4f7z\" (UniqueName: \"kubernetes.io/projected/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-kube-api-access-w4f7z\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.057380 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-config-data\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.057438 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-db-sync-config-data\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.057462 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-combined-ca-bundle\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.057479 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4f7z\" (UniqueName: \"kubernetes.io/projected/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-kube-api-access-w4f7z\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.063005 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-config-data\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.063097 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-combined-ca-bundle\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.064909 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-db-sync-config-data\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.075521 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4f7z\" (UniqueName: \"kubernetes.io/projected/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-kube-api-access-w4f7z\") pod \"glance-db-sync-s6tmr\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.115012 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s6tmr"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.677166 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s6tmr"]
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.723170 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3728aab2-9679-48c9-8efa-fec1eba7bb89" path="/var/lib/kubelet/pods/3728aab2-9679-48c9-8efa-fec1eba7bb89/volumes"
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.765203 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s6tmr" event={"ID":"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0","Type":"ContainerStarted","Data":"bb7dfc86731afea6f3d3cc61c68c9b327924f74fce34d1a8241b2b58be23cb0c"}
Feb 24 03:12:19 crc kubenswrapper[4923]: I0224 03:12:19.986867 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-84w4c"
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.077264 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-kube-api-access-kgxk8\") pod \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") "
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.077341 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-swiftconf\") pod \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") "
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.077377 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-scripts\") pod \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") "
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.077445 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-combined-ca-bundle\") pod \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") "
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.077481 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-dispersionconf\") pod \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") "
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.077499 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-etc-swift\") pod \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") "
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.077597 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-ring-data-devices\") pod \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\" (UID: \"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c\") "
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.078728 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" (UID: "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.080969 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" (UID: "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.084475 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-kube-api-access-kgxk8" (OuterVolumeSpecName: "kube-api-access-kgxk8") pod "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" (UID: "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c"). InnerVolumeSpecName "kube-api-access-kgxk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.087049 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" (UID: "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.098253 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-scripts" (OuterVolumeSpecName: "scripts") pod "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" (UID: "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.100699 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" (UID: "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.104793 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" (UID: "6113f2e8-dd3f-42d4-92f3-8fd56e4b458c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.179919 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgxk8\" (UniqueName: \"kubernetes.io/projected/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-kube-api-access-kgxk8\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.180191 4923 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.180201 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.180210 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 
03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.180218 4923 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.180229 4923 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.180237 4923 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6113f2e8-dd3f-42d4-92f3-8fd56e4b458c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.278353 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-d7ggx"] Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.288886 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-d7ggx"] Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.299553 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bs9t5"] Feb 24 03:12:20 crc kubenswrapper[4923]: E0224 03:12:20.299935 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" containerName="swift-ring-rebalance" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.299953 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" containerName="swift-ring-rebalance" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.300128 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="6113f2e8-dd3f-42d4-92f3-8fd56e4b458c" containerName="swift-ring-rebalance" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.300681 4923 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.305564 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bs9t5"] Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.305965 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.382388 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n8k6\" (UniqueName: \"kubernetes.io/projected/16745ab2-3d14-4ee0-9385-f3f8913b865e-kube-api-access-4n8k6\") pod \"root-account-create-update-bs9t5\" (UID: \"16745ab2-3d14-4ee0-9385-f3f8913b865e\") " pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.382851 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16745ab2-3d14-4ee0-9385-f3f8913b865e-operator-scripts\") pod \"root-account-create-update-bs9t5\" (UID: \"16745ab2-3d14-4ee0-9385-f3f8913b865e\") " pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.484544 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n8k6\" (UniqueName: \"kubernetes.io/projected/16745ab2-3d14-4ee0-9385-f3f8913b865e-kube-api-access-4n8k6\") pod \"root-account-create-update-bs9t5\" (UID: \"16745ab2-3d14-4ee0-9385-f3f8913b865e\") " pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.484697 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16745ab2-3d14-4ee0-9385-f3f8913b865e-operator-scripts\") pod 
\"root-account-create-update-bs9t5\" (UID: \"16745ab2-3d14-4ee0-9385-f3f8913b865e\") " pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.486252 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16745ab2-3d14-4ee0-9385-f3f8913b865e-operator-scripts\") pod \"root-account-create-update-bs9t5\" (UID: \"16745ab2-3d14-4ee0-9385-f3f8913b865e\") " pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.508480 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n8k6\" (UniqueName: \"kubernetes.io/projected/16745ab2-3d14-4ee0-9385-f3f8913b865e-kube-api-access-4n8k6\") pod \"root-account-create-update-bs9t5\" (UID: \"16745ab2-3d14-4ee0-9385-f3f8913b865e\") " pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.618701 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.776566 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-84w4c" event={"ID":"6113f2e8-dd3f-42d4-92f3-8fd56e4b458c","Type":"ContainerDied","Data":"ceea821318579834b048f16f403a8bf1a989c8560ec38833f5e1572cad831935"} Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.776604 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceea821318579834b048f16f403a8bf1a989c8560ec38833f5e1572cad831935" Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.776654 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-84w4c" Feb 24 03:12:20 crc kubenswrapper[4923]: W0224 03:12:20.921074 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16745ab2_3d14_4ee0_9385_f3f8913b865e.slice/crio-7be878ba410553e864414673876739d80e4b83d50818dfe19948b6f392b27304 WatchSource:0}: Error finding container 7be878ba410553e864414673876739d80e4b83d50818dfe19948b6f392b27304: Status 404 returned error can't find the container with id 7be878ba410553e864414673876739d80e4b83d50818dfe19948b6f392b27304 Feb 24 03:12:20 crc kubenswrapper[4923]: I0224 03:12:20.925397 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bs9t5"] Feb 24 03:12:21 crc kubenswrapper[4923]: E0224 03:12:21.389066 4923 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16745ab2_3d14_4ee0_9385_f3f8913b865e.slice/crio-7eeaf19ae8f81b8e45e7b0e92abe21d4fe177989adcff7aa93b3d31836471735.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16745ab2_3d14_4ee0_9385_f3f8913b865e.slice/crio-conmon-7eeaf19ae8f81b8e45e7b0e92abe21d4fe177989adcff7aa93b3d31836471735.scope\": RecentStats: unable to find data in memory cache]" Feb 24 03:12:21 crc kubenswrapper[4923]: I0224 03:12:21.722057 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d49324-da62-4a35-be40-046dabff25aa" path="/var/lib/kubelet/pods/80d49324-da62-4a35-be40-046dabff25aa/volumes" Feb 24 03:12:21 crc kubenswrapper[4923]: I0224 03:12:21.791101 4923 generic.go:334] "Generic (PLEG): container finished" podID="16745ab2-3d14-4ee0-9385-f3f8913b865e" containerID="7eeaf19ae8f81b8e45e7b0e92abe21d4fe177989adcff7aa93b3d31836471735" exitCode=0 Feb 24 03:12:21 crc kubenswrapper[4923]: I0224 
03:12:21.791153 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bs9t5" event={"ID":"16745ab2-3d14-4ee0-9385-f3f8913b865e","Type":"ContainerDied","Data":"7eeaf19ae8f81b8e45e7b0e92abe21d4fe177989adcff7aa93b3d31836471735"} Feb 24 03:12:21 crc kubenswrapper[4923]: I0224 03:12:21.791180 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bs9t5" event={"ID":"16745ab2-3d14-4ee0-9385-f3f8913b865e","Type":"ContainerStarted","Data":"7be878ba410553e864414673876739d80e4b83d50818dfe19948b6f392b27304"} Feb 24 03:12:22 crc kubenswrapper[4923]: I0224 03:12:22.922504 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:22 crc kubenswrapper[4923]: I0224 03:12:22.928973 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f-etc-swift\") pod \"swift-storage-0\" (UID: \"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f\") " pod="openstack/swift-storage-0" Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.018558 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.178172 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.228403 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n8k6\" (UniqueName: \"kubernetes.io/projected/16745ab2-3d14-4ee0-9385-f3f8913b865e-kube-api-access-4n8k6\") pod \"16745ab2-3d14-4ee0-9385-f3f8913b865e\" (UID: \"16745ab2-3d14-4ee0-9385-f3f8913b865e\") " Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.228473 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16745ab2-3d14-4ee0-9385-f3f8913b865e-operator-scripts\") pod \"16745ab2-3d14-4ee0-9385-f3f8913b865e\" (UID: \"16745ab2-3d14-4ee0-9385-f3f8913b865e\") " Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.229678 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16745ab2-3d14-4ee0-9385-f3f8913b865e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16745ab2-3d14-4ee0-9385-f3f8913b865e" (UID: "16745ab2-3d14-4ee0-9385-f3f8913b865e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.230172 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16745ab2-3d14-4ee0-9385-f3f8913b865e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.235340 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16745ab2-3d14-4ee0-9385-f3f8913b865e-kube-api-access-4n8k6" (OuterVolumeSpecName: "kube-api-access-4n8k6") pod "16745ab2-3d14-4ee0-9385-f3f8913b865e" (UID: "16745ab2-3d14-4ee0-9385-f3f8913b865e"). InnerVolumeSpecName "kube-api-access-4n8k6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.330991 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n8k6\" (UniqueName: \"kubernetes.io/projected/16745ab2-3d14-4ee0-9385-f3f8913b865e-kube-api-access-4n8k6\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.529981 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.807013 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"3cd936e63b377794d5b54217f821009d1ed16144f6d17d0c3dbcdcdaf3efaa89"} Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.808645 4923 generic.go:334] "Generic (PLEG): container finished" podID="6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" containerID="780d10fe0138b329c1369b0f1cbd1e1e5c8fbef05d9caf5651b24f4d82a4f4d2" exitCode=0 Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.808697 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0","Type":"ContainerDied","Data":"780d10fe0138b329c1369b0f1cbd1e1e5c8fbef05d9caf5651b24f4d82a4f4d2"} Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.811330 4923 generic.go:334] "Generic (PLEG): container finished" podID="d9950d0b-d980-4e4f-82b4-9f616c6c50a3" containerID="1ef05ab77af0e174ff2a3a3a25eb2a8838b22904e83d0e1d6e1693dfaaf19763" exitCode=0 Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.811420 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d9950d0b-d980-4e4f-82b4-9f616c6c50a3","Type":"ContainerDied","Data":"1ef05ab77af0e174ff2a3a3a25eb2a8838b22904e83d0e1d6e1693dfaaf19763"} Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.816106 4923 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bs9t5" event={"ID":"16745ab2-3d14-4ee0-9385-f3f8913b865e","Type":"ContainerDied","Data":"7be878ba410553e864414673876739d80e4b83d50818dfe19948b6f392b27304"} Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.816152 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7be878ba410553e864414673876739d80e4b83d50818dfe19948b6f392b27304" Feb 24 03:12:23 crc kubenswrapper[4923]: I0224 03:12:23.816222 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bs9t5" Feb 24 03:12:24 crc kubenswrapper[4923]: I0224 03:12:24.464024 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 24 03:12:24 crc kubenswrapper[4923]: I0224 03:12:24.825891 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0","Type":"ContainerStarted","Data":"f214badca1940336d886b38fbd14ffa06f3f3192e9d3d1cef2ec79f7bef5c6b0"} Feb 24 03:12:24 crc kubenswrapper[4923]: I0224 03:12:24.826712 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:12:24 crc kubenswrapper[4923]: I0224 03:12:24.831337 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d9950d0b-d980-4e4f-82b4-9f616c6c50a3","Type":"ContainerStarted","Data":"713435589abcdf1b3efff239c9809bb5f70abfdf930ae064d9f58c2904554131"} Feb 24 03:12:24 crc kubenswrapper[4923]: I0224 03:12:24.831592 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 24 03:12:24 crc kubenswrapper[4923]: I0224 03:12:24.850722 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.843154463 
podStartE2EDuration="55.850701032s" podCreationTimestamp="2026-02-24 03:11:29 +0000 UTC" firstStartedPulling="2026-02-24 03:11:35.593399818 +0000 UTC m=+1019.610470641" lastFinishedPulling="2026-02-24 03:11:50.600946397 +0000 UTC m=+1034.618017210" observedRunningTime="2026-02-24 03:12:24.842722983 +0000 UTC m=+1068.859793796" watchObservedRunningTime="2026-02-24 03:12:24.850701032 +0000 UTC m=+1068.867771845" Feb 24 03:12:24 crc kubenswrapper[4923]: I0224 03:12:24.877070 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.059943703 podStartE2EDuration="56.877041983s" podCreationTimestamp="2026-02-24 03:11:28 +0000 UTC" firstStartedPulling="2026-02-24 03:11:30.793145091 +0000 UTC m=+1014.810215904" lastFinishedPulling="2026-02-24 03:11:50.610243371 +0000 UTC m=+1034.627314184" observedRunningTime="2026-02-24 03:12:24.868046927 +0000 UTC m=+1068.885117780" watchObservedRunningTime="2026-02-24 03:12:24.877041983 +0000 UTC m=+1068.894112826" Feb 24 03:12:28 crc kubenswrapper[4923]: I0224 03:12:28.661031 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6l624" podUID="6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095" containerName="ovn-controller" probeResult="failure" output=< Feb 24 03:12:28 crc kubenswrapper[4923]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 24 03:12:28 crc kubenswrapper[4923]: > Feb 24 03:12:28 crc kubenswrapper[4923]: I0224 03:12:28.665477 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:12:32 crc kubenswrapper[4923]: I0224 03:12:32.908135 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s6tmr" event={"ID":"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0","Type":"ContainerStarted","Data":"f0411f4396fdafa783cbeff142f86afa3c7fa450917b01300c5520b5fae25405"} Feb 24 03:12:32 crc kubenswrapper[4923]: 
I0224 03:12:32.912620 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"8c9c3472a82ca3f9ae6b6944160c13af95ebca27c2d308cf16c2bace6cb187ec"} Feb 24 03:12:32 crc kubenswrapper[4923]: I0224 03:12:32.912653 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"4a93cbf27e1c3dc2da1b7856977ebc5947bda53b583a3375fb77bac688b4765c"} Feb 24 03:12:32 crc kubenswrapper[4923]: I0224 03:12:32.912666 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"a6007ce059272e60de7b89a6d1fe6da0ce65fac5cf49babe8823d1e3613cbee1"} Feb 24 03:12:32 crc kubenswrapper[4923]: I0224 03:12:32.912677 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"1908cd5b03ab852753b27465307c2e0952d25d52957b4565326cc287917a300c"} Feb 24 03:12:32 crc kubenswrapper[4923]: I0224 03:12:32.929168 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-s6tmr" podStartSLOduration=2.8246440870000002 podStartE2EDuration="14.929143734s" podCreationTimestamp="2026-02-24 03:12:18 +0000 UTC" firstStartedPulling="2026-02-24 03:12:19.674865202 +0000 UTC m=+1063.691936015" lastFinishedPulling="2026-02-24 03:12:31.779364829 +0000 UTC m=+1075.796435662" observedRunningTime="2026-02-24 03:12:32.926582067 +0000 UTC m=+1076.943652890" watchObservedRunningTime="2026-02-24 03:12:32.929143734 +0000 UTC m=+1076.946214557" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.658496 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6l624" podUID="6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095" 
containerName="ovn-controller" probeResult="failure" output=< Feb 24 03:12:33 crc kubenswrapper[4923]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 24 03:12:33 crc kubenswrapper[4923]: > Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.690981 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-555wh" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.903406 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6l624-config-qx9ll"] Feb 24 03:12:33 crc kubenswrapper[4923]: E0224 03:12:33.904288 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16745ab2-3d14-4ee0-9385-f3f8913b865e" containerName="mariadb-account-create-update" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.904333 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="16745ab2-3d14-4ee0-9385-f3f8913b865e" containerName="mariadb-account-create-update" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.904563 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="16745ab2-3d14-4ee0-9385-f3f8913b865e" containerName="mariadb-account-create-update" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.905117 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.907668 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.913756 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-log-ovn\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.913884 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.913921 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2rw\" (UniqueName: \"kubernetes.io/projected/8e5364a6-d471-448a-956e-5992f58b1f47-kube-api-access-zh2rw\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.913965 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-scripts\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.913987 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-additional-scripts\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.914038 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run-ovn\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.922228 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6l624-config-qx9ll"] Feb 24 03:12:33 crc kubenswrapper[4923]: I0224 03:12:33.965601 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"c94518f7e22c11ac5b30dbfab505b8e4219d3450ec74cb22512e0f2efe79721b"} Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.015246 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-scripts\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.015312 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-additional-scripts\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " 
pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.015419 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run-ovn\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.015448 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-log-ovn\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.015651 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.015704 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2rw\" (UniqueName: \"kubernetes.io/projected/8e5364a6-d471-448a-956e-5992f58b1f47-kube-api-access-zh2rw\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.015939 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run-ovn\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " 
pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.016008 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-log-ovn\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.016096 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.016483 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-additional-scripts\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.020073 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-scripts\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.034153 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh2rw\" (UniqueName: \"kubernetes.io/projected/8e5364a6-d471-448a-956e-5992f58b1f47-kube-api-access-zh2rw\") pod \"ovn-controller-6l624-config-qx9ll\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 
03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.318129 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.574911 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6l624-config-qx9ll"] Feb 24 03:12:34 crc kubenswrapper[4923]: W0224 03:12:34.575721 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e5364a6_d471_448a_956e_5992f58b1f47.slice/crio-5110ce0a0199d87d9d8cb18f1c658dd403144a944f5fa59a4b414ffe1d7fd9a9 WatchSource:0}: Error finding container 5110ce0a0199d87d9d8cb18f1c658dd403144a944f5fa59a4b414ffe1d7fd9a9: Status 404 returned error can't find the container with id 5110ce0a0199d87d9d8cb18f1c658dd403144a944f5fa59a4b414ffe1d7fd9a9 Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.978639 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6l624-config-qx9ll" event={"ID":"8e5364a6-d471-448a-956e-5992f58b1f47","Type":"ContainerStarted","Data":"402bfedafb99a5b76acdee3ff430f6d1b0dc0dcd5ce0bef31c9f6a5f1492efe0"} Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.978687 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6l624-config-qx9ll" event={"ID":"8e5364a6-d471-448a-956e-5992f58b1f47","Type":"ContainerStarted","Data":"5110ce0a0199d87d9d8cb18f1c658dd403144a944f5fa59a4b414ffe1d7fd9a9"} Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.984260 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"a374b9285bdcd43696ebd2d188b7a14220194d8e1b16dd510cd2db55586649d1"} Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.984422 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"a42828d220815eb99603f5a29e45467e1c4fe2eac958495380f5f930cb475640"} Feb 24 03:12:34 crc kubenswrapper[4923]: I0224 03:12:34.984490 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"273c1d9f6eb930b03bd774e457ae6b7a92eb7a603cdedeca23ae5aaaa0a5e97d"} Feb 24 03:12:35 crc kubenswrapper[4923]: I0224 03:12:35.004685 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6l624-config-qx9ll" podStartSLOduration=2.004668606 podStartE2EDuration="2.004668606s" podCreationTimestamp="2026-02-24 03:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:35.002437727 +0000 UTC m=+1079.019508540" watchObservedRunningTime="2026-02-24 03:12:35.004668606 +0000 UTC m=+1079.021739419" Feb 24 03:12:36 crc kubenswrapper[4923]: I0224 03:12:36.005895 4923 generic.go:334] "Generic (PLEG): container finished" podID="8e5364a6-d471-448a-956e-5992f58b1f47" containerID="402bfedafb99a5b76acdee3ff430f6d1b0dc0dcd5ce0bef31c9f6a5f1492efe0" exitCode=0 Feb 24 03:12:36 crc kubenswrapper[4923]: I0224 03:12:36.006324 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6l624-config-qx9ll" event={"ID":"8e5364a6-d471-448a-956e-5992f58b1f47","Type":"ContainerDied","Data":"402bfedafb99a5b76acdee3ff430f6d1b0dc0dcd5ce0bef31c9f6a5f1492efe0"} Feb 24 03:12:36 crc kubenswrapper[4923]: I0224 03:12:36.009973 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"4c6f2061503ddae24b489abaf394ed37115be96a4a99920c4c3246646e3e2910"} Feb 24 03:12:36 crc kubenswrapper[4923]: I0224 03:12:36.009998 4923 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"7b4a9855d5a4a1d4a188efc729af3ec0f5145365879ebf530337c772e82864c9"} Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.024452 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"18a63b30b4e519176be4d909fff7c7d23b2bb8867d1e7a86eeb81bc28b8f1c46"} Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.024755 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"40152530a3d47d9d4c93a25da52fc626e827302f7d23af0244e12dd55d3b3bd3"} Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.024764 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"efb10bc80cdf69c62d1108435c1e10ba93c1ca06331f5cbabf5f4c0a461632b2"} Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.024774 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"45252ecd94bb228793b71cfcbaa01003ad6e286581a7df01b3cf3220b93cde86"} Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.024782 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f","Type":"ContainerStarted","Data":"4b57e7028b0f3dcb0976120a38e012b2d67047b1369d3ac794cfa198bc76e0a3"} Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.064377 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.121272141 podStartE2EDuration="32.064357833s" podCreationTimestamp="2026-02-24 03:12:05 
+0000 UTC" firstStartedPulling="2026-02-24 03:12:23.532407696 +0000 UTC m=+1067.549478509" lastFinishedPulling="2026-02-24 03:12:35.475493378 +0000 UTC m=+1079.492564201" observedRunningTime="2026-02-24 03:12:37.062971377 +0000 UTC m=+1081.080042180" watchObservedRunningTime="2026-02-24 03:12:37.064357833 +0000 UTC m=+1081.081428656" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.379862 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fp4ks"] Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.381387 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.385870 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.392242 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fp4ks"] Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.414116 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.478654 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.478769 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-config\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.478823 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.478852 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw5hk\" (UniqueName: \"kubernetes.io/projected/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-kube-api-access-fw5hk\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.478887 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-svc\") pod 
\"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.478901 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580339 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-additional-scripts\") pod \"8e5364a6-d471-448a-956e-5992f58b1f47\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580389 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run\") pod \"8e5364a6-d471-448a-956e-5992f58b1f47\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580529 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run" (OuterVolumeSpecName: "var-run") pod "8e5364a6-d471-448a-956e-5992f58b1f47" (UID: "8e5364a6-d471-448a-956e-5992f58b1f47"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580558 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-scripts\") pod \"8e5364a6-d471-448a-956e-5992f58b1f47\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580602 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh2rw\" (UniqueName: \"kubernetes.io/projected/8e5364a6-d471-448a-956e-5992f58b1f47-kube-api-access-zh2rw\") pod \"8e5364a6-d471-448a-956e-5992f58b1f47\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580657 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run-ovn\") pod \"8e5364a6-d471-448a-956e-5992f58b1f47\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580682 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-log-ovn\") pod \"8e5364a6-d471-448a-956e-5992f58b1f47\" (UID: \"8e5364a6-d471-448a-956e-5992f58b1f47\") " Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580775 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8e5364a6-d471-448a-956e-5992f58b1f47" (UID: "8e5364a6-d471-448a-956e-5992f58b1f47"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580902 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8e5364a6-d471-448a-956e-5992f58b1f47" (UID: "8e5364a6-d471-448a-956e-5992f58b1f47"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580951 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.580992 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw5hk\" (UniqueName: \"kubernetes.io/projected/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-kube-api-access-fw5hk\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.581014 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.581029 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.581062 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.581135 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-config\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.581197 4923 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.581207 4923 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.581216 4923 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5364a6-d471-448a-956e-5992f58b1f47-var-run\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.581984 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc 
kubenswrapper[4923]: I0224 03:12:37.582039 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.582111 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-config\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.582184 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8e5364a6-d471-448a-956e-5992f58b1f47" (UID: "8e5364a6-d471-448a-956e-5992f58b1f47"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.582331 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.582337 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-scripts" (OuterVolumeSpecName: "scripts") pod "8e5364a6-d471-448a-956e-5992f58b1f47" (UID: "8e5364a6-d471-448a-956e-5992f58b1f47"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.582711 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.589120 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5364a6-d471-448a-956e-5992f58b1f47-kube-api-access-zh2rw" (OuterVolumeSpecName: "kube-api-access-zh2rw") pod "8e5364a6-d471-448a-956e-5992f58b1f47" (UID: "8e5364a6-d471-448a-956e-5992f58b1f47"). InnerVolumeSpecName "kube-api-access-zh2rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.597148 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw5hk\" (UniqueName: \"kubernetes.io/projected/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-kube-api-access-fw5hk\") pod \"dnsmasq-dns-5c79d794d7-fp4ks\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.682874 4923 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.682918 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e5364a6-d471-448a-956e-5992f58b1f47-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.682932 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh2rw\" (UniqueName: 
\"kubernetes.io/projected/8e5364a6-d471-448a-956e-5992f58b1f47-kube-api-access-zh2rw\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:37 crc kubenswrapper[4923]: I0224 03:12:37.720871 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.033172 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6l624-config-qx9ll" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.033345 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6l624-config-qx9ll" event={"ID":"8e5364a6-d471-448a-956e-5992f58b1f47","Type":"ContainerDied","Data":"5110ce0a0199d87d9d8cb18f1c658dd403144a944f5fa59a4b414ffe1d7fd9a9"} Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.033986 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5110ce0a0199d87d9d8cb18f1c658dd403144a944f5fa59a4b414ffe1d7fd9a9" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.099579 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6l624-config-qx9ll"] Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.106478 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6l624-config-qx9ll"] Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.201482 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fp4ks"] Feb 24 03:12:38 crc kubenswrapper[4923]: W0224 03:12:38.205275 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff2c8cd7_82f3_44fc_8d69_4e45d9e21b5b.slice/crio-ef58c71d7467a278bbc87830c1ef68473f3bc6143a12e121d36c7f8ba037ea9e WatchSource:0}: Error finding container ef58c71d7467a278bbc87830c1ef68473f3bc6143a12e121d36c7f8ba037ea9e: Status 404 returned error can't find the 
container with id ef58c71d7467a278bbc87830c1ef68473f3bc6143a12e121d36c7f8ba037ea9e Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.295244 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6l624-config-6ffhv"] Feb 24 03:12:38 crc kubenswrapper[4923]: E0224 03:12:38.295567 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5364a6-d471-448a-956e-5992f58b1f47" containerName="ovn-config" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.295585 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5364a6-d471-448a-956e-5992f58b1f47" containerName="ovn-config" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.295746 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5364a6-d471-448a-956e-5992f58b1f47" containerName="ovn-config" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.296246 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.312644 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.348863 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6l624-config-6ffhv"] Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.405414 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-additional-scripts\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.405513 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-scripts\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.405613 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run-ovn\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.405669 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.405747 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8sr\" (UniqueName: \"kubernetes.io/projected/e20f79ce-df92-4668-8e31-d057734a5ce6-kube-api-access-9f8sr\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.405855 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-log-ovn\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.507519 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.507853 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8sr\" (UniqueName: \"kubernetes.io/projected/e20f79ce-df92-4668-8e31-d057734a5ce6-kube-api-access-9f8sr\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.507909 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-log-ovn\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.507974 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-additional-scripts\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.508001 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-scripts\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.508036 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run-ovn\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.508101 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-log-ovn\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.508192 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run-ovn\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.507972 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.508903 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-additional-scripts\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.510270 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-scripts\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.523811 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8sr\" (UniqueName: \"kubernetes.io/projected/e20f79ce-df92-4668-8e31-d057734a5ce6-kube-api-access-9f8sr\") pod \"ovn-controller-6l624-config-6ffhv\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.677019 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6l624" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.693980 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:38 crc kubenswrapper[4923]: I0224 03:12:38.943904 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6l624-config-6ffhv"] Feb 24 03:12:39 crc kubenswrapper[4923]: I0224 03:12:39.053755 4923 generic.go:334] "Generic (PLEG): container finished" podID="ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" containerID="0a8b05dd27413b3898cfa9da0c0a9d1b63d976cbb447edb1c4ef7b5b64ddcf15" exitCode=0 Feb 24 03:12:39 crc kubenswrapper[4923]: I0224 03:12:39.053824 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" event={"ID":"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b","Type":"ContainerDied","Data":"0a8b05dd27413b3898cfa9da0c0a9d1b63d976cbb447edb1c4ef7b5b64ddcf15"} Feb 24 03:12:39 crc kubenswrapper[4923]: I0224 03:12:39.053846 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" 
event={"ID":"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b","Type":"ContainerStarted","Data":"ef58c71d7467a278bbc87830c1ef68473f3bc6143a12e121d36c7f8ba037ea9e"} Feb 24 03:12:39 crc kubenswrapper[4923]: I0224 03:12:39.056365 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6l624-config-6ffhv" event={"ID":"e20f79ce-df92-4668-8e31-d057734a5ce6","Type":"ContainerStarted","Data":"883a272760009c672a895b9e485dad273bb3bf07d49d7433eb09634b14096978"} Feb 24 03:12:39 crc kubenswrapper[4923]: I0224 03:12:39.723677 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5364a6-d471-448a-956e-5992f58b1f47" path="/var/lib/kubelet/pods/8e5364a6-d471-448a-956e-5992f58b1f47/volumes" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.064144 4923 generic.go:334] "Generic (PLEG): container finished" podID="e20f79ce-df92-4668-8e31-d057734a5ce6" containerID="fea3d74be017adec0f287e658df385cb96dac19e272b623b9689db1c0588a683" exitCode=0 Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.064226 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6l624-config-6ffhv" event={"ID":"e20f79ce-df92-4668-8e31-d057734a5ce6","Type":"ContainerDied","Data":"fea3d74be017adec0f287e658df385cb96dac19e272b623b9689db1c0588a683"} Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.066281 4923 generic.go:334] "Generic (PLEG): container finished" podID="72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0" containerID="f0411f4396fdafa783cbeff142f86afa3c7fa450917b01300c5520b5fae25405" exitCode=0 Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.066356 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s6tmr" event={"ID":"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0","Type":"ContainerDied","Data":"f0411f4396fdafa783cbeff142f86afa3c7fa450917b01300c5520b5fae25405"} Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.068525 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" event={"ID":"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b","Type":"ContainerStarted","Data":"ff329fbe5e8fe22510fc5fa1adc20cf30514354075659b6184cd149442139275"} Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.068693 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.112616 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" podStartSLOduration=3.112598915 podStartE2EDuration="3.112598915s" podCreationTimestamp="2026-02-24 03:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:40.106480524 +0000 UTC m=+1084.123551357" watchObservedRunningTime="2026-02-24 03:12:40.112598915 +0000 UTC m=+1084.129669738" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.289707 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.574138 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gpxn4"] Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.575091 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.603893 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gpxn4"] Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.619484 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.680058 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1a60-account-create-update-9zjxd"] Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.681142 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.683329 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.694055 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1a60-account-create-update-9zjxd"] Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.745102 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2194e053-012e-4478-aa2c-70dceb03dc7a-operator-scripts\") pod \"cinder-db-create-gpxn4\" (UID: \"2194e053-012e-4478-aa2c-70dceb03dc7a\") " pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.745180 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq8l5\" (UniqueName: \"kubernetes.io/projected/2194e053-012e-4478-aa2c-70dceb03dc7a-kube-api-access-mq8l5\") pod \"cinder-db-create-gpxn4\" (UID: \"2194e053-012e-4478-aa2c-70dceb03dc7a\") " pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.846995 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2194e053-012e-4478-aa2c-70dceb03dc7a-operator-scripts\") pod \"cinder-db-create-gpxn4\" (UID: \"2194e053-012e-4478-aa2c-70dceb03dc7a\") " pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.847060 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq8l5\" (UniqueName: \"kubernetes.io/projected/2194e053-012e-4478-aa2c-70dceb03dc7a-kube-api-access-mq8l5\") pod \"cinder-db-create-gpxn4\" (UID: \"2194e053-012e-4478-aa2c-70dceb03dc7a\") " pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.847094 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-operator-scripts\") pod \"cinder-1a60-account-create-update-9zjxd\" (UID: \"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\") " pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.847147 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5znwb\" (UniqueName: \"kubernetes.io/projected/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-kube-api-access-5znwb\") pod \"cinder-1a60-account-create-update-9zjxd\" (UID: \"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\") " pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.847869 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2194e053-012e-4478-aa2c-70dceb03dc7a-operator-scripts\") pod \"cinder-db-create-gpxn4\" (UID: \"2194e053-012e-4478-aa2c-70dceb03dc7a\") " pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:40 crc kubenswrapper[4923]: 
I0224 03:12:40.884166 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-lpjk7"] Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.885039 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.895236 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq8l5\" (UniqueName: \"kubernetes.io/projected/2194e053-012e-4478-aa2c-70dceb03dc7a-kube-api-access-mq8l5\") pod \"cinder-db-create-gpxn4\" (UID: \"2194e053-012e-4478-aa2c-70dceb03dc7a\") " pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.905223 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-lpjk7"] Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.910635 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d178-account-create-update-pck7k"] Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.911530 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.913304 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.932386 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d178-account-create-update-pck7k"] Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.955222 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5znwb\" (UniqueName: \"kubernetes.io/projected/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-kube-api-access-5znwb\") pod \"cinder-1a60-account-create-update-9zjxd\" (UID: \"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\") " pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.955370 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-operator-scripts\") pod \"cinder-1a60-account-create-update-9zjxd\" (UID: \"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\") " pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:40 crc kubenswrapper[4923]: I0224 03:12:40.956135 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-operator-scripts\") pod \"cinder-1a60-account-create-update-9zjxd\" (UID: \"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\") " pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.000925 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5znwb\" (UniqueName: \"kubernetes.io/projected/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-kube-api-access-5znwb\") pod \"cinder-1a60-account-create-update-9zjxd\" (UID: 
\"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\") " pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.023734 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fjcwf"] Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.024656 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.046489 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fjcwf"] Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.059349 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/813d78f2-56a0-4658-b5fd-ab17e99db899-operator-scripts\") pod \"barbican-db-create-lpjk7\" (UID: \"813d78f2-56a0-4658-b5fd-ab17e99db899\") " pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.059430 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzxg5\" (UniqueName: \"kubernetes.io/projected/813d78f2-56a0-4658-b5fd-ab17e99db899-kube-api-access-gzxg5\") pod \"barbican-db-create-lpjk7\" (UID: \"813d78f2-56a0-4658-b5fd-ab17e99db899\") " pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.059466 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wm6x\" (UniqueName: \"kubernetes.io/projected/b7328613-da6b-4e74-9857-d0d8a799c505-kube-api-access-4wm6x\") pod \"barbican-d178-account-create-update-pck7k\" (UID: \"b7328613-da6b-4e74-9857-d0d8a799c505\") " pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.059499 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7328613-da6b-4e74-9857-d0d8a799c505-operator-scripts\") pod \"barbican-d178-account-create-update-pck7k\" (UID: \"b7328613-da6b-4e74-9857-d0d8a799c505\") " pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.109871 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cd2f-account-create-update-l8bc9"] Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.110832 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.117867 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.149872 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cd2f-account-create-update-l8bc9"] Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.162036 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzxg5\" (UniqueName: \"kubernetes.io/projected/813d78f2-56a0-4658-b5fd-ab17e99db899-kube-api-access-gzxg5\") pod \"barbican-db-create-lpjk7\" (UID: \"813d78f2-56a0-4658-b5fd-ab17e99db899\") " pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.162090 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33de35a6-2c83-4492-8ca5-103c030007ea-operator-scripts\") pod \"neutron-db-create-fjcwf\" (UID: \"33de35a6-2c83-4492-8ca5-103c030007ea\") " pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.162113 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4wm6x\" (UniqueName: \"kubernetes.io/projected/b7328613-da6b-4e74-9857-d0d8a799c505-kube-api-access-4wm6x\") pod \"barbican-d178-account-create-update-pck7k\" (UID: \"b7328613-da6b-4e74-9857-d0d8a799c505\") " pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.162149 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7328613-da6b-4e74-9857-d0d8a799c505-operator-scripts\") pod \"barbican-d178-account-create-update-pck7k\" (UID: \"b7328613-da6b-4e74-9857-d0d8a799c505\") " pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.162205 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cndtg\" (UniqueName: \"kubernetes.io/projected/33de35a6-2c83-4492-8ca5-103c030007ea-kube-api-access-cndtg\") pod \"neutron-db-create-fjcwf\" (UID: \"33de35a6-2c83-4492-8ca5-103c030007ea\") " pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.162244 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/813d78f2-56a0-4658-b5fd-ab17e99db899-operator-scripts\") pod \"barbican-db-create-lpjk7\" (UID: \"813d78f2-56a0-4658-b5fd-ab17e99db899\") " pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.162844 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/813d78f2-56a0-4658-b5fd-ab17e99db899-operator-scripts\") pod \"barbican-db-create-lpjk7\" (UID: \"813d78f2-56a0-4658-b5fd-ab17e99db899\") " pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.163619 4923 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7328613-da6b-4e74-9857-d0d8a799c505-operator-scripts\") pod \"barbican-d178-account-create-update-pck7k\" (UID: \"b7328613-da6b-4e74-9857-d0d8a799c505\") " pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.188698 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzxg5\" (UniqueName: \"kubernetes.io/projected/813d78f2-56a0-4658-b5fd-ab17e99db899-kube-api-access-gzxg5\") pod \"barbican-db-create-lpjk7\" (UID: \"813d78f2-56a0-4658-b5fd-ab17e99db899\") " pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.192155 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.197274 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qq8ck"] Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.198198 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.202018 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-26zv5" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.204427 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.204610 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.204727 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.210850 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wm6x\" (UniqueName: \"kubernetes.io/projected/b7328613-da6b-4e74-9857-d0d8a799c505-kube-api-access-4wm6x\") pod \"barbican-d178-account-create-update-pck7k\" (UID: \"b7328613-da6b-4e74-9857-d0d8a799c505\") " pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.219568 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qq8ck"] Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.263511 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33de35a6-2c83-4492-8ca5-103c030007ea-operator-scripts\") pod \"neutron-db-create-fjcwf\" (UID: \"33de35a6-2c83-4492-8ca5-103c030007ea\") " pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.263718 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9bfw\" (UniqueName: \"kubernetes.io/projected/02b56c32-f855-4c43-a006-c546a59e977f-kube-api-access-l9bfw\") pod 
\"neutron-cd2f-account-create-update-l8bc9\" (UID: \"02b56c32-f855-4c43-a006-c546a59e977f\") " pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.263755 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cndtg\" (UniqueName: \"kubernetes.io/projected/33de35a6-2c83-4492-8ca5-103c030007ea-kube-api-access-cndtg\") pod \"neutron-db-create-fjcwf\" (UID: \"33de35a6-2c83-4492-8ca5-103c030007ea\") " pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.263817 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02b56c32-f855-4c43-a006-c546a59e977f-operator-scripts\") pod \"neutron-cd2f-account-create-update-l8bc9\" (UID: \"02b56c32-f855-4c43-a006-c546a59e977f\") " pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.264665 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33de35a6-2c83-4492-8ca5-103c030007ea-operator-scripts\") pod \"neutron-db-create-fjcwf\" (UID: \"33de35a6-2c83-4492-8ca5-103c030007ea\") " pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.270029 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.281774 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cndtg\" (UniqueName: \"kubernetes.io/projected/33de35a6-2c83-4492-8ca5-103c030007ea-kube-api-access-cndtg\") pod \"neutron-db-create-fjcwf\" (UID: \"33de35a6-2c83-4492-8ca5-103c030007ea\") " pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.293411 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.295246 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.365452 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9bfw\" (UniqueName: \"kubernetes.io/projected/02b56c32-f855-4c43-a006-c546a59e977f-kube-api-access-l9bfw\") pod \"neutron-cd2f-account-create-update-l8bc9\" (UID: \"02b56c32-f855-4c43-a006-c546a59e977f\") " pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.366142 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-config-data\") pod \"keystone-db-sync-qq8ck\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.366525 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhjln\" (UniqueName: \"kubernetes.io/projected/49b6f21a-18fb-4c73-8286-32f1def45bac-kube-api-access-vhjln\") pod \"keystone-db-sync-qq8ck\" (UID: 
\"49b6f21a-18fb-4c73-8286-32f1def45bac\") " pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.366594 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02b56c32-f855-4c43-a006-c546a59e977f-operator-scripts\") pod \"neutron-cd2f-account-create-update-l8bc9\" (UID: \"02b56c32-f855-4c43-a006-c546a59e977f\") " pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.366673 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-combined-ca-bundle\") pod \"keystone-db-sync-qq8ck\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.367692 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02b56c32-f855-4c43-a006-c546a59e977f-operator-scripts\") pod \"neutron-cd2f-account-create-update-l8bc9\" (UID: \"02b56c32-f855-4c43-a006-c546a59e977f\") " pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.385867 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9bfw\" (UniqueName: \"kubernetes.io/projected/02b56c32-f855-4c43-a006-c546a59e977f-kube-api-access-l9bfw\") pod \"neutron-cd2f-account-create-update-l8bc9\" (UID: \"02b56c32-f855-4c43-a006-c546a59e977f\") " pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.401686 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.434425 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.468360 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhjln\" (UniqueName: \"kubernetes.io/projected/49b6f21a-18fb-4c73-8286-32f1def45bac-kube-api-access-vhjln\") pod \"keystone-db-sync-qq8ck\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.468498 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-combined-ca-bundle\") pod \"keystone-db-sync-qq8ck\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.468580 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-config-data\") pod \"keystone-db-sync-qq8ck\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.474058 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-combined-ca-bundle\") pod \"keystone-db-sync-qq8ck\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.474079 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-config-data\") pod \"keystone-db-sync-qq8ck\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.482811 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.488879 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhjln\" (UniqueName: \"kubernetes.io/projected/49b6f21a-18fb-4c73-8286-32f1def45bac-kube-api-access-vhjln\") pod \"keystone-db-sync-qq8ck\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.617671 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.671394 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f8sr\" (UniqueName: \"kubernetes.io/projected/e20f79ce-df92-4668-8e31-d057734a5ce6-kube-api-access-9f8sr\") pod \"e20f79ce-df92-4668-8e31-d057734a5ce6\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.671500 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-log-ovn\") pod \"e20f79ce-df92-4668-8e31-d057734a5ce6\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.671557 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-scripts\") pod \"e20f79ce-df92-4668-8e31-d057734a5ce6\" (UID: 
\"e20f79ce-df92-4668-8e31-d057734a5ce6\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.671599 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run\") pod \"e20f79ce-df92-4668-8e31-d057734a5ce6\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.671638 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run-ovn\") pod \"e20f79ce-df92-4668-8e31-d057734a5ce6\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.671678 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-additional-scripts\") pod \"e20f79ce-df92-4668-8e31-d057734a5ce6\" (UID: \"e20f79ce-df92-4668-8e31-d057734a5ce6\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.673137 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e20f79ce-df92-4668-8e31-d057734a5ce6" (UID: "e20f79ce-df92-4668-8e31-d057734a5ce6"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.673183 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run" (OuterVolumeSpecName: "var-run") pod "e20f79ce-df92-4668-8e31-d057734a5ce6" (UID: "e20f79ce-df92-4668-8e31-d057734a5ce6"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.673206 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e20f79ce-df92-4668-8e31-d057734a5ce6" (UID: "e20f79ce-df92-4668-8e31-d057734a5ce6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.673252 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-scripts" (OuterVolumeSpecName: "scripts") pod "e20f79ce-df92-4668-8e31-d057734a5ce6" (UID: "e20f79ce-df92-4668-8e31-d057734a5ce6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.673315 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e20f79ce-df92-4668-8e31-d057734a5ce6" (UID: "e20f79ce-df92-4668-8e31-d057734a5ce6"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.679783 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20f79ce-df92-4668-8e31-d057734a5ce6-kube-api-access-9f8sr" (OuterVolumeSpecName: "kube-api-access-9f8sr") pod "e20f79ce-df92-4668-8e31-d057734a5ce6" (UID: "e20f79ce-df92-4668-8e31-d057734a5ce6"). InnerVolumeSpecName "kube-api-access-9f8sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.691145 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s6tmr" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.775038 4923 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.775077 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.775091 4923 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.775105 4923 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e20f79ce-df92-4668-8e31-d057734a5ce6-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.775119 4923 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e20f79ce-df92-4668-8e31-d057734a5ce6-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.775131 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f8sr\" (UniqueName: \"kubernetes.io/projected/e20f79ce-df92-4668-8e31-d057734a5ce6-kube-api-access-9f8sr\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:41 crc kubenswrapper[4923]: W0224 03:12:41.803549 4923 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode20f79ce_df92_4668_8e31_d057734a5ce6.slice/memory.min": read 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode20f79ce_df92_4668_8e31_d057734a5ce6.slice/memory.min: no such device Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.857337 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-lpjk7"] Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.875667 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4f7z\" (UniqueName: \"kubernetes.io/projected/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-kube-api-access-w4f7z\") pod \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.875746 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-config-data\") pod \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.875800 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-db-sync-config-data\") pod \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.875911 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-combined-ca-bundle\") pod \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\" (UID: \"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0\") " Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.881042 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-db-sync-config-data" (OuterVolumeSpecName: 
"db-sync-config-data") pod "72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0" (UID: "72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.882689 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-kube-api-access-w4f7z" (OuterVolumeSpecName: "kube-api-access-w4f7z") pod "72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0" (UID: "72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0"). InnerVolumeSpecName "kube-api-access-w4f7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: W0224 03:12:41.888070 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2194e053_012e_4478_aa2c_70dceb03dc7a.slice/crio-1732525a85793a0ea106e4b7f7535322e80076bc916f08d3c0795af5c149e1eb WatchSource:0}: Error finding container 1732525a85793a0ea106e4b7f7535322e80076bc916f08d3c0795af5c149e1eb: Status 404 returned error can't find the container with id 1732525a85793a0ea106e4b7f7535322e80076bc916f08d3c0795af5c149e1eb Feb 24 03:12:41 crc kubenswrapper[4923]: E0224 03:12:41.899596 4923 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode20f79ce_df92_4668_8e31_d057734a5ce6.slice/crio-883a272760009c672a895b9e485dad273bb3bf07d49d7433eb09634b14096978\": RecentStats: unable to find data in memory cache]" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.900834 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gpxn4"] Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.905810 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0" (UID: "72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.941671 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-config-data" (OuterVolumeSpecName: "config-data") pod "72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0" (UID: "72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.978144 4923 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.978184 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.978194 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4f7z\" (UniqueName: \"kubernetes.io/projected/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-kube-api-access-w4f7z\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:41 crc kubenswrapper[4923]: I0224 03:12:41.978206 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.060078 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1a60-account-create-update-9zjxd"] Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.127531 4923 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a60-account-create-update-9zjxd" event={"ID":"bfb8e452-36b8-4359-b5ed-34499c3b4fa4","Type":"ContainerStarted","Data":"d92aa36b2039a3e2478ebb0921d87ffe8bbfaf25105543cb10c33d69a718e713"} Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.133400 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fjcwf"] Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.137940 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-lpjk7" event={"ID":"813d78f2-56a0-4658-b5fd-ab17e99db899","Type":"ContainerStarted","Data":"dd628193f6452a0393a61ce273d68a3f6a7a196a6a3085666eba9718cfcd7738"} Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.139207 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gpxn4" event={"ID":"2194e053-012e-4478-aa2c-70dceb03dc7a","Type":"ContainerStarted","Data":"1732525a85793a0ea106e4b7f7535322e80076bc916f08d3c0795af5c149e1eb"} Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.140589 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6l624-config-6ffhv" event={"ID":"e20f79ce-df92-4668-8e31-d057734a5ce6","Type":"ContainerDied","Data":"883a272760009c672a895b9e485dad273bb3bf07d49d7433eb09634b14096978"} Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.140619 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="883a272760009c672a895b9e485dad273bb3bf07d49d7433eb09634b14096978" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.140699 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6l624-config-6ffhv" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.142508 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d178-account-create-update-pck7k"] Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.145145 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s6tmr" event={"ID":"72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0","Type":"ContainerDied","Data":"bb7dfc86731afea6f3d3cc61c68c9b327924f74fce34d1a8241b2b58be23cb0c"} Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.145180 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb7dfc86731afea6f3d3cc61c68c9b327924f74fce34d1a8241b2b58be23cb0c" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.145226 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s6tmr" Feb 24 03:12:42 crc kubenswrapper[4923]: W0224 03:12:42.156133 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7328613_da6b_4e74_9857_d0d8a799c505.slice/crio-643ee6fb8a3f8291a0b0e668fff3e14120d1bbd8bc11833bc93b0929c3b94415 WatchSource:0}: Error finding container 643ee6fb8a3f8291a0b0e668fff3e14120d1bbd8bc11833bc93b0929c3b94415: Status 404 returned error can't find the container with id 643ee6fb8a3f8291a0b0e668fff3e14120d1bbd8bc11833bc93b0929c3b94415 Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.233704 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cd2f-account-create-update-l8bc9"] Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.306202 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qq8ck"] Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.528170 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fp4ks"] Feb 
24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.528597 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" podUID="ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" containerName="dnsmasq-dns" containerID="cri-o://ff329fbe5e8fe22510fc5fa1adc20cf30514354075659b6184cd149442139275" gracePeriod=10 Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.570181 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-frjd7"] Feb 24 03:12:42 crc kubenswrapper[4923]: E0224 03:12:42.571080 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0" containerName="glance-db-sync" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.571153 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0" containerName="glance-db-sync" Feb 24 03:12:42 crc kubenswrapper[4923]: E0224 03:12:42.571233 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20f79ce-df92-4668-8e31-d057734a5ce6" containerName="ovn-config" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.571288 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20f79ce-df92-4668-8e31-d057734a5ce6" containerName="ovn-config" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.571553 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0" containerName="glance-db-sync" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.572497 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20f79ce-df92-4668-8e31-d057734a5ce6" containerName="ovn-config" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.574146 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.602972 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-frjd7"] Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.633255 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6l624-config-6ffhv"] Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.648266 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6l624-config-6ffhv"] Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.689579 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.689702 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-config\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.689734 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.689795 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.689869 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wk7v\" (UniqueName: \"kubernetes.io/projected/9c060758-83bf-4a7c-90b4-13c2a20194b3-kube-api-access-4wk7v\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.689903 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.791800 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wk7v\" (UniqueName: \"kubernetes.io/projected/9c060758-83bf-4a7c-90b4-13c2a20194b3-kube-api-access-4wk7v\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.791869 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.791935 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.792005 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-config\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.792029 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.792060 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.793075 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.793985 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-config\") pod 
\"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.794021 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.794480 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.794933 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.819350 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wk7v\" (UniqueName: \"kubernetes.io/projected/9c060758-83bf-4a7c-90b4-13c2a20194b3-kube-api-access-4wk7v\") pod \"dnsmasq-dns-5f59b8f679-frjd7\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:42 crc kubenswrapper[4923]: I0224 03:12:42.923768 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.180367 4923 generic.go:334] "Generic (PLEG): container finished" podID="33de35a6-2c83-4492-8ca5-103c030007ea" containerID="b2296e9fb1a8750cad9c0817f5ef39ba4dbe840ed0490c45e1d95fa96444b076" exitCode=0 Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.180713 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fjcwf" event={"ID":"33de35a6-2c83-4492-8ca5-103c030007ea","Type":"ContainerDied","Data":"b2296e9fb1a8750cad9c0817f5ef39ba4dbe840ed0490c45e1d95fa96444b076"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.180772 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fjcwf" event={"ID":"33de35a6-2c83-4492-8ca5-103c030007ea","Type":"ContainerStarted","Data":"5f99ff967312aa25361dd35d69a234c8c1e6a7d35c84c0d9472fe404e8d6b082"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.187626 4923 generic.go:334] "Generic (PLEG): container finished" podID="bfb8e452-36b8-4359-b5ed-34499c3b4fa4" containerID="5a56240bfb14e5676d37cfad63c22505a75168bd7630cd8fbd3d0dc237a79ea6" exitCode=0 Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.187723 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a60-account-create-update-9zjxd" event={"ID":"bfb8e452-36b8-4359-b5ed-34499c3b4fa4","Type":"ContainerDied","Data":"5a56240bfb14e5676d37cfad63c22505a75168bd7630cd8fbd3d0dc237a79ea6"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.191532 4923 generic.go:334] "Generic (PLEG): container finished" podID="813d78f2-56a0-4658-b5fd-ab17e99db899" containerID="50a88ed7328adc04b18a6a4b686a6385c3e4a6466c70a598270a5448168c54dd" exitCode=0 Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.191684 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-lpjk7" 
event={"ID":"813d78f2-56a0-4658-b5fd-ab17e99db899","Type":"ContainerDied","Data":"50a88ed7328adc04b18a6a4b686a6385c3e4a6466c70a598270a5448168c54dd"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.198579 4923 generic.go:334] "Generic (PLEG): container finished" podID="2194e053-012e-4478-aa2c-70dceb03dc7a" containerID="9a57243cfbd13d04565d801702db8cce47d931c3215f3134c3a0f4cb9e99d173" exitCode=0 Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.198643 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gpxn4" event={"ID":"2194e053-012e-4478-aa2c-70dceb03dc7a","Type":"ContainerDied","Data":"9a57243cfbd13d04565d801702db8cce47d931c3215f3134c3a0f4cb9e99d173"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.205819 4923 generic.go:334] "Generic (PLEG): container finished" podID="ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" containerID="ff329fbe5e8fe22510fc5fa1adc20cf30514354075659b6184cd149442139275" exitCode=0 Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.205894 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" event={"ID":"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b","Type":"ContainerDied","Data":"ff329fbe5e8fe22510fc5fa1adc20cf30514354075659b6184cd149442139275"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.205935 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" event={"ID":"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b","Type":"ContainerDied","Data":"ef58c71d7467a278bbc87830c1ef68473f3bc6143a12e121d36c7f8ba037ea9e"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.205949 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef58c71d7467a278bbc87830c1ef68473f3bc6143a12e121d36c7f8ba037ea9e" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.212612 4923 generic.go:334] "Generic (PLEG): container finished" podID="b7328613-da6b-4e74-9857-d0d8a799c505" 
containerID="d04673a161d50866cd3470d985fb7ff18c49c4cd31b1193ba6059fb111199c83" exitCode=0 Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.212687 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d178-account-create-update-pck7k" event={"ID":"b7328613-da6b-4e74-9857-d0d8a799c505","Type":"ContainerDied","Data":"d04673a161d50866cd3470d985fb7ff18c49c4cd31b1193ba6059fb111199c83"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.212712 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d178-account-create-update-pck7k" event={"ID":"b7328613-da6b-4e74-9857-d0d8a799c505","Type":"ContainerStarted","Data":"643ee6fb8a3f8291a0b0e668fff3e14120d1bbd8bc11833bc93b0929c3b94415"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.217008 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qq8ck" event={"ID":"49b6f21a-18fb-4c73-8286-32f1def45bac","Type":"ContainerStarted","Data":"e3de9cb455cf11fb212dfb18d157434ad13aaeec971056819beacb231b5f4a5e"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.218596 4923 generic.go:334] "Generic (PLEG): container finished" podID="02b56c32-f855-4c43-a006-c546a59e977f" containerID="0614963ad2e1c963986cba6667764f2345d8c4c2d13c882a196321835b8be31e" exitCode=0 Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.218644 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd2f-account-create-update-l8bc9" event={"ID":"02b56c32-f855-4c43-a006-c546a59e977f","Type":"ContainerDied","Data":"0614963ad2e1c963986cba6667764f2345d8c4c2d13c882a196321835b8be31e"} Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.218668 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd2f-account-create-update-l8bc9" event={"ID":"02b56c32-f855-4c43-a006-c546a59e977f","Type":"ContainerStarted","Data":"cc279e6b27c9688cc3dfd905d2643a1a0b6f36394c2ec50e43bf672483329fb3"} Feb 24 03:12:43 crc kubenswrapper[4923]: 
I0224 03:12:43.238601 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.378974 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-frjd7"] Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.410642 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-config\") pod \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.410713 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw5hk\" (UniqueName: \"kubernetes.io/projected/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-kube-api-access-fw5hk\") pod \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.410761 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-svc\") pod \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.410778 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-nb\") pod \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.410814 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-swift-storage-0\") pod 
\"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.410919 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-sb\") pod \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\" (UID: \"ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b\") " Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.418483 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-kube-api-access-fw5hk" (OuterVolumeSpecName: "kube-api-access-fw5hk") pod "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" (UID: "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b"). InnerVolumeSpecName "kube-api-access-fw5hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.462316 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" (UID: "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.478791 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" (UID: "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.478814 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" (UID: "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.478823 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-config" (OuterVolumeSpecName: "config") pod "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" (UID: "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.480866 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" (UID: "ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.512989 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw5hk\" (UniqueName: \"kubernetes.io/projected/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-kube-api-access-fw5hk\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.513024 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.513035 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.513044 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.513052 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.513061 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:43 crc kubenswrapper[4923]: I0224 03:12:43.735324 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20f79ce-df92-4668-8e31-d057734a5ce6" path="/var/lib/kubelet/pods/e20f79ce-df92-4668-8e31-d057734a5ce6/volumes" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.235235 4923 generic.go:334] 
"Generic (PLEG): container finished" podID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerID="dc78fc3ab389062e12dbb82c0becb861e0e72f87f4c96426437740eb849e38ee" exitCode=0 Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.235849 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" event={"ID":"9c060758-83bf-4a7c-90b4-13c2a20194b3","Type":"ContainerDied","Data":"dc78fc3ab389062e12dbb82c0becb861e0e72f87f4c96426437740eb849e38ee"} Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.235876 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" event={"ID":"9c060758-83bf-4a7c-90b4-13c2a20194b3","Type":"ContainerStarted","Data":"700c6f1f5762c48841fb04db345c7950ccb585a61dd1e37a5659ac0aa0b5334c"} Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.236064 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-fp4ks" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.423865 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fp4ks"] Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.437605 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-fp4ks"] Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.562526 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.721450 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.732114 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/813d78f2-56a0-4658-b5fd-ab17e99db899-operator-scripts\") pod \"813d78f2-56a0-4658-b5fd-ab17e99db899\" (UID: \"813d78f2-56a0-4658-b5fd-ab17e99db899\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.732163 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzxg5\" (UniqueName: \"kubernetes.io/projected/813d78f2-56a0-4658-b5fd-ab17e99db899-kube-api-access-gzxg5\") pod \"813d78f2-56a0-4658-b5fd-ab17e99db899\" (UID: \"813d78f2-56a0-4658-b5fd-ab17e99db899\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.732876 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/813d78f2-56a0-4658-b5fd-ab17e99db899-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "813d78f2-56a0-4658-b5fd-ab17e99db899" (UID: "813d78f2-56a0-4658-b5fd-ab17e99db899"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.738118 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.742408 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813d78f2-56a0-4658-b5fd-ab17e99db899-kube-api-access-gzxg5" (OuterVolumeSpecName: "kube-api-access-gzxg5") pod "813d78f2-56a0-4658-b5fd-ab17e99db899" (UID: "813d78f2-56a0-4658-b5fd-ab17e99db899"). InnerVolumeSpecName "kube-api-access-gzxg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.754958 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.756585 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.779280 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.837599 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7328613-da6b-4e74-9857-d0d8a799c505-operator-scripts\") pod \"b7328613-da6b-4e74-9857-d0d8a799c505\" (UID: \"b7328613-da6b-4e74-9857-d0d8a799c505\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.837707 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cndtg\" (UniqueName: \"kubernetes.io/projected/33de35a6-2c83-4492-8ca5-103c030007ea-kube-api-access-cndtg\") pod \"33de35a6-2c83-4492-8ca5-103c030007ea\" (UID: \"33de35a6-2c83-4492-8ca5-103c030007ea\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.837739 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wm6x\" (UniqueName: \"kubernetes.io/projected/b7328613-da6b-4e74-9857-d0d8a799c505-kube-api-access-4wm6x\") pod \"b7328613-da6b-4e74-9857-d0d8a799c505\" (UID: \"b7328613-da6b-4e74-9857-d0d8a799c505\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.837792 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33de35a6-2c83-4492-8ca5-103c030007ea-operator-scripts\") 
pod \"33de35a6-2c83-4492-8ca5-103c030007ea\" (UID: \"33de35a6-2c83-4492-8ca5-103c030007ea\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.838246 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/813d78f2-56a0-4658-b5fd-ab17e99db899-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.838263 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzxg5\" (UniqueName: \"kubernetes.io/projected/813d78f2-56a0-4658-b5fd-ab17e99db899-kube-api-access-gzxg5\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.838764 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7328613-da6b-4e74-9857-d0d8a799c505-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7328613-da6b-4e74-9857-d0d8a799c505" (UID: "b7328613-da6b-4e74-9857-d0d8a799c505"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.838867 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33de35a6-2c83-4492-8ca5-103c030007ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33de35a6-2c83-4492-8ca5-103c030007ea" (UID: "33de35a6-2c83-4492-8ca5-103c030007ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.840928 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7328613-da6b-4e74-9857-d0d8a799c505-kube-api-access-4wm6x" (OuterVolumeSpecName: "kube-api-access-4wm6x") pod "b7328613-da6b-4e74-9857-d0d8a799c505" (UID: "b7328613-da6b-4e74-9857-d0d8a799c505"). InnerVolumeSpecName "kube-api-access-4wm6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.842065 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33de35a6-2c83-4492-8ca5-103c030007ea-kube-api-access-cndtg" (OuterVolumeSpecName: "kube-api-access-cndtg") pod "33de35a6-2c83-4492-8ca5-103c030007ea" (UID: "33de35a6-2c83-4492-8ca5-103c030007ea"). InnerVolumeSpecName "kube-api-access-cndtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.939585 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-operator-scripts\") pod \"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\" (UID: \"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.939663 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9bfw\" (UniqueName: \"kubernetes.io/projected/02b56c32-f855-4c43-a006-c546a59e977f-kube-api-access-l9bfw\") pod \"02b56c32-f855-4c43-a006-c546a59e977f\" (UID: \"02b56c32-f855-4c43-a006-c546a59e977f\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.939762 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq8l5\" (UniqueName: \"kubernetes.io/projected/2194e053-012e-4478-aa2c-70dceb03dc7a-kube-api-access-mq8l5\") pod \"2194e053-012e-4478-aa2c-70dceb03dc7a\" (UID: \"2194e053-012e-4478-aa2c-70dceb03dc7a\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.939800 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02b56c32-f855-4c43-a006-c546a59e977f-operator-scripts\") pod \"02b56c32-f855-4c43-a006-c546a59e977f\" (UID: \"02b56c32-f855-4c43-a006-c546a59e977f\") " Feb 24 03:12:44 crc 
kubenswrapper[4923]: I0224 03:12:44.939822 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2194e053-012e-4478-aa2c-70dceb03dc7a-operator-scripts\") pod \"2194e053-012e-4478-aa2c-70dceb03dc7a\" (UID: \"2194e053-012e-4478-aa2c-70dceb03dc7a\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.939847 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5znwb\" (UniqueName: \"kubernetes.io/projected/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-kube-api-access-5znwb\") pod \"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\" (UID: \"bfb8e452-36b8-4359-b5ed-34499c3b4fa4\") " Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.940221 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7328613-da6b-4e74-9857-d0d8a799c505-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.940243 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cndtg\" (UniqueName: \"kubernetes.io/projected/33de35a6-2c83-4492-8ca5-103c030007ea-kube-api-access-cndtg\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.940253 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wm6x\" (UniqueName: \"kubernetes.io/projected/b7328613-da6b-4e74-9857-d0d8a799c505-kube-api-access-4wm6x\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.940263 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33de35a6-2c83-4492-8ca5-103c030007ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.941068 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/02b56c32-f855-4c43-a006-c546a59e977f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02b56c32-f855-4c43-a006-c546a59e977f" (UID: "02b56c32-f855-4c43-a006-c546a59e977f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.941318 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2194e053-012e-4478-aa2c-70dceb03dc7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2194e053-012e-4478-aa2c-70dceb03dc7a" (UID: "2194e053-012e-4478-aa2c-70dceb03dc7a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.941518 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bfb8e452-36b8-4359-b5ed-34499c3b4fa4" (UID: "bfb8e452-36b8-4359-b5ed-34499c3b4fa4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.944026 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-kube-api-access-5znwb" (OuterVolumeSpecName: "kube-api-access-5znwb") pod "bfb8e452-36b8-4359-b5ed-34499c3b4fa4" (UID: "bfb8e452-36b8-4359-b5ed-34499c3b4fa4"). InnerVolumeSpecName "kube-api-access-5znwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.944088 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2194e053-012e-4478-aa2c-70dceb03dc7a-kube-api-access-mq8l5" (OuterVolumeSpecName: "kube-api-access-mq8l5") pod "2194e053-012e-4478-aa2c-70dceb03dc7a" (UID: "2194e053-012e-4478-aa2c-70dceb03dc7a"). InnerVolumeSpecName "kube-api-access-mq8l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:44 crc kubenswrapper[4923]: I0224 03:12:44.944610 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b56c32-f855-4c43-a006-c546a59e977f-kube-api-access-l9bfw" (OuterVolumeSpecName: "kube-api-access-l9bfw") pod "02b56c32-f855-4c43-a006-c546a59e977f" (UID: "02b56c32-f855-4c43-a006-c546a59e977f"). InnerVolumeSpecName "kube-api-access-l9bfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.042208 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.042451 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9bfw\" (UniqueName: \"kubernetes.io/projected/02b56c32-f855-4c43-a006-c546a59e977f-kube-api-access-l9bfw\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.042561 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq8l5\" (UniqueName: \"kubernetes.io/projected/2194e053-012e-4478-aa2c-70dceb03dc7a-kube-api-access-mq8l5\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.042639 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/02b56c32-f855-4c43-a006-c546a59e977f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.042712 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2194e053-012e-4478-aa2c-70dceb03dc7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.042783 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5znwb\" (UniqueName: \"kubernetes.io/projected/bfb8e452-36b8-4359-b5ed-34499c3b4fa4-kube-api-access-5znwb\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.245582 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cd2f-account-create-update-l8bc9" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.245588 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd2f-account-create-update-l8bc9" event={"ID":"02b56c32-f855-4c43-a006-c546a59e977f","Type":"ContainerDied","Data":"cc279e6b27c9688cc3dfd905d2643a1a0b6f36394c2ec50e43bf672483329fb3"} Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.245636 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc279e6b27c9688cc3dfd905d2643a1a0b6f36394c2ec50e43bf672483329fb3" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.247088 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" event={"ID":"9c060758-83bf-4a7c-90b4-13c2a20194b3","Type":"ContainerStarted","Data":"0f6191a8718c2bb6f45afbfa65d8a1e10252cc87287859ec22708178def8f5a9"} Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.247201 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.248762 4923 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fjcwf" event={"ID":"33de35a6-2c83-4492-8ca5-103c030007ea","Type":"ContainerDied","Data":"5f99ff967312aa25361dd35d69a234c8c1e6a7d35c84c0d9472fe404e8d6b082"} Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.248792 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f99ff967312aa25361dd35d69a234c8c1e6a7d35c84c0d9472fe404e8d6b082" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.248838 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fjcwf" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.250433 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1a60-account-create-update-9zjxd" event={"ID":"bfb8e452-36b8-4359-b5ed-34499c3b4fa4","Type":"ContainerDied","Data":"d92aa36b2039a3e2478ebb0921d87ffe8bbfaf25105543cb10c33d69a718e713"} Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.250458 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d92aa36b2039a3e2478ebb0921d87ffe8bbfaf25105543cb10c33d69a718e713" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.250506 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1a60-account-create-update-9zjxd" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.252112 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-lpjk7" event={"ID":"813d78f2-56a0-4658-b5fd-ab17e99db899","Type":"ContainerDied","Data":"dd628193f6452a0393a61ce273d68a3f6a7a196a6a3085666eba9718cfcd7738"} Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.252133 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-lpjk7" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.252142 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd628193f6452a0393a61ce273d68a3f6a7a196a6a3085666eba9718cfcd7738" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.254498 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gpxn4" event={"ID":"2194e053-012e-4478-aa2c-70dceb03dc7a","Type":"ContainerDied","Data":"1732525a85793a0ea106e4b7f7535322e80076bc916f08d3c0795af5c149e1eb"} Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.254524 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1732525a85793a0ea106e4b7f7535322e80076bc916f08d3c0795af5c149e1eb" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.254569 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gpxn4" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.271165 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d178-account-create-update-pck7k" event={"ID":"b7328613-da6b-4e74-9857-d0d8a799c505","Type":"ContainerDied","Data":"643ee6fb8a3f8291a0b0e668fff3e14120d1bbd8bc11833bc93b0929c3b94415"} Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.271208 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="643ee6fb8a3f8291a0b0e668fff3e14120d1bbd8bc11833bc93b0929c3b94415" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.271275 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d178-account-create-update-pck7k" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.272958 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" podStartSLOduration=3.272939288 podStartE2EDuration="3.272939288s" podCreationTimestamp="2026-02-24 03:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:45.270633187 +0000 UTC m=+1089.287704010" watchObservedRunningTime="2026-02-24 03:12:45.272939288 +0000 UTC m=+1089.290010101" Feb 24 03:12:45 crc kubenswrapper[4923]: I0224 03:12:45.722526 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" path="/var/lib/kubelet/pods/ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b/volumes" Feb 24 03:12:49 crc kubenswrapper[4923]: I0224 03:12:49.303372 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qq8ck" event={"ID":"49b6f21a-18fb-4c73-8286-32f1def45bac","Type":"ContainerStarted","Data":"5c2500c5dc77a1effb5c1da39cebeecfa432b765951689e57690b2fa16bc1a6f"} Feb 24 03:12:49 crc kubenswrapper[4923]: I0224 03:12:49.335937 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qq8ck" podStartSLOduration=1.779763058 podStartE2EDuration="8.335910711s" podCreationTimestamp="2026-02-24 03:12:41 +0000 UTC" firstStartedPulling="2026-02-24 03:12:42.337038483 +0000 UTC m=+1086.354109296" lastFinishedPulling="2026-02-24 03:12:48.893186136 +0000 UTC m=+1092.910256949" observedRunningTime="2026-02-24 03:12:49.322023247 +0000 UTC m=+1093.339094100" watchObservedRunningTime="2026-02-24 03:12:49.335910711 +0000 UTC m=+1093.352981564" Feb 24 03:12:49 crc kubenswrapper[4923]: I0224 03:12:49.916949 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:12:49 crc kubenswrapper[4923]: I0224 03:12:49.917029 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:12:52 crc kubenswrapper[4923]: I0224 03:12:52.328573 4923 generic.go:334] "Generic (PLEG): container finished" podID="49b6f21a-18fb-4c73-8286-32f1def45bac" containerID="5c2500c5dc77a1effb5c1da39cebeecfa432b765951689e57690b2fa16bc1a6f" exitCode=0 Feb 24 03:12:52 crc kubenswrapper[4923]: I0224 03:12:52.328613 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qq8ck" event={"ID":"49b6f21a-18fb-4c73-8286-32f1def45bac","Type":"ContainerDied","Data":"5c2500c5dc77a1effb5c1da39cebeecfa432b765951689e57690b2fa16bc1a6f"} Feb 24 03:12:52 crc kubenswrapper[4923]: I0224 03:12:52.925412 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:12:52 crc kubenswrapper[4923]: I0224 03:12:52.996031 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xdptp"] Feb 24 03:12:52 crc kubenswrapper[4923]: I0224 03:12:52.996251 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" podUID="def9fcfc-1366-4652-a4c7-aeec946c3d96" containerName="dnsmasq-dns" containerID="cri-o://feb6787c7e98f2fbe412f3b5398b72c45d5eb2cc9c230d730805b44e2cd1686f" gracePeriod=10 Feb 24 03:12:53 crc kubenswrapper[4923]: I0224 03:12:53.344336 4923 generic.go:334] "Generic (PLEG): container finished" 
podID="def9fcfc-1366-4652-a4c7-aeec946c3d96" containerID="feb6787c7e98f2fbe412f3b5398b72c45d5eb2cc9c230d730805b44e2cd1686f" exitCode=0 Feb 24 03:12:53 crc kubenswrapper[4923]: I0224 03:12:53.344644 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" event={"ID":"def9fcfc-1366-4652-a4c7-aeec946c3d96","Type":"ContainerDied","Data":"feb6787c7e98f2fbe412f3b5398b72c45d5eb2cc9c230d730805b44e2cd1686f"} Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.278127 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.355207 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qq8ck" event={"ID":"49b6f21a-18fb-4c73-8286-32f1def45bac","Type":"ContainerDied","Data":"e3de9cb455cf11fb212dfb18d157434ad13aaeec971056819beacb231b5f4a5e"} Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.356042 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3de9cb455cf11fb212dfb18d157434ad13aaeec971056819beacb231b5f4a5e" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.355736 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qq8ck" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.412406 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhjln\" (UniqueName: \"kubernetes.io/projected/49b6f21a-18fb-4c73-8286-32f1def45bac-kube-api-access-vhjln\") pod \"49b6f21a-18fb-4c73-8286-32f1def45bac\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.412662 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-combined-ca-bundle\") pod \"49b6f21a-18fb-4c73-8286-32f1def45bac\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.412732 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-config-data\") pod \"49b6f21a-18fb-4c73-8286-32f1def45bac\" (UID: \"49b6f21a-18fb-4c73-8286-32f1def45bac\") " Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.418522 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b6f21a-18fb-4c73-8286-32f1def45bac-kube-api-access-vhjln" (OuterVolumeSpecName: "kube-api-access-vhjln") pod "49b6f21a-18fb-4c73-8286-32f1def45bac" (UID: "49b6f21a-18fb-4c73-8286-32f1def45bac"). InnerVolumeSpecName "kube-api-access-vhjln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.460907 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-config-data" (OuterVolumeSpecName: "config-data") pod "49b6f21a-18fb-4c73-8286-32f1def45bac" (UID: "49b6f21a-18fb-4c73-8286-32f1def45bac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.460992 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49b6f21a-18fb-4c73-8286-32f1def45bac" (UID: "49b6f21a-18fb-4c73-8286-32f1def45bac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.514471 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.514513 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b6f21a-18fb-4c73-8286-32f1def45bac-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.514524 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhjln\" (UniqueName: \"kubernetes.io/projected/49b6f21a-18fb-4c73-8286-32f1def45bac-kube-api-access-vhjln\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.632821 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9thl9"] Feb 24 03:12:54 crc kubenswrapper[4923]: E0224 03:12:54.633140 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33de35a6-2c83-4492-8ca5-103c030007ea" containerName="mariadb-database-create" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633156 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="33de35a6-2c83-4492-8ca5-103c030007ea" containerName="mariadb-database-create" Feb 24 03:12:54 crc kubenswrapper[4923]: E0224 03:12:54.633174 4923 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b7328613-da6b-4e74-9857-d0d8a799c505" containerName="mariadb-account-create-update" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633180 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7328613-da6b-4e74-9857-d0d8a799c505" containerName="mariadb-account-create-update" Feb 24 03:12:54 crc kubenswrapper[4923]: E0224 03:12:54.633191 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" containerName="dnsmasq-dns" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633199 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" containerName="dnsmasq-dns" Feb 24 03:12:54 crc kubenswrapper[4923]: E0224 03:12:54.633205 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813d78f2-56a0-4658-b5fd-ab17e99db899" containerName="mariadb-database-create" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633212 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="813d78f2-56a0-4658-b5fd-ab17e99db899" containerName="mariadb-database-create" Feb 24 03:12:54 crc kubenswrapper[4923]: E0224 03:12:54.633221 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b6f21a-18fb-4c73-8286-32f1def45bac" containerName="keystone-db-sync" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633227 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b6f21a-18fb-4c73-8286-32f1def45bac" containerName="keystone-db-sync" Feb 24 03:12:54 crc kubenswrapper[4923]: E0224 03:12:54.633246 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" containerName="init" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633252 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" containerName="init" Feb 24 03:12:54 crc kubenswrapper[4923]: E0224 03:12:54.633263 4923 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2194e053-012e-4478-aa2c-70dceb03dc7a" containerName="mariadb-database-create" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633269 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="2194e053-012e-4478-aa2c-70dceb03dc7a" containerName="mariadb-database-create" Feb 24 03:12:54 crc kubenswrapper[4923]: E0224 03:12:54.633316 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb8e452-36b8-4359-b5ed-34499c3b4fa4" containerName="mariadb-account-create-update" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633322 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb8e452-36b8-4359-b5ed-34499c3b4fa4" containerName="mariadb-account-create-update" Feb 24 03:12:54 crc kubenswrapper[4923]: E0224 03:12:54.633330 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b56c32-f855-4c43-a006-c546a59e977f" containerName="mariadb-account-create-update" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633336 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b56c32-f855-4c43-a006-c546a59e977f" containerName="mariadb-account-create-update" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633475 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="33de35a6-2c83-4492-8ca5-103c030007ea" containerName="mariadb-database-create" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633489 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb8e452-36b8-4359-b5ed-34499c3b4fa4" containerName="mariadb-account-create-update" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633499 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="02b56c32-f855-4c43-a006-c546a59e977f" containerName="mariadb-account-create-update" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633509 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2c8cd7-82f3-44fc-8d69-4e45d9e21b5b" 
containerName="dnsmasq-dns" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633521 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="2194e053-012e-4478-aa2c-70dceb03dc7a" containerName="mariadb-database-create" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633529 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7328613-da6b-4e74-9857-d0d8a799c505" containerName="mariadb-account-create-update" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633540 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="813d78f2-56a0-4658-b5fd-ab17e99db899" containerName="mariadb-database-create" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.633550 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b6f21a-18fb-4c73-8286-32f1def45bac" containerName="keystone-db-sync" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.634024 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.645824 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8z89j"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.645853 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.647024 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.663367 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9thl9"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.692008 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8z89j"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819074 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819118 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-config-data\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819147 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-combined-ca-bundle\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819168 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-fernet-keys\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 
03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819191 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnsd\" (UniqueName: \"kubernetes.io/projected/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-kube-api-access-svnsd\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819208 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-credential-keys\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819226 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6tcg\" (UniqueName: \"kubernetes.io/projected/a06236b1-dc62-4def-877d-65184413973e-kube-api-access-q6tcg\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819247 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819269 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: 
\"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819312 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819327 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-config\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.819386 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-scripts\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.835286 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gdl2d"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.836527 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.843237 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.843337 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mmk8t" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.843446 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.851089 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gdl2d"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.881043 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bc6fbcd4c-xk5sx"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.882432 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.887083 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.887289 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.887412 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-6m68b" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.901555 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.907966 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bc6fbcd4c-xk5sx"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.918572 4923 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.920538 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921026 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921065 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-config-data\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921107 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-combined-ca-bundle\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921131 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-fernet-keys\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921161 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svnsd\" (UniqueName: \"kubernetes.io/projected/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-kube-api-access-svnsd\") pod 
\"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921179 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-credential-keys\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921195 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6tcg\" (UniqueName: \"kubernetes.io/projected/a06236b1-dc62-4def-877d-65184413973e-kube-api-access-q6tcg\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921216 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921242 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921281 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: 
\"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921315 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-config\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.921376 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-scripts\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.922250 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.923829 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.924398 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc 
kubenswrapper[4923]: I0224 03:12:54.925672 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-config\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.925709 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.928751 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.929023 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.929210 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-config-data\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.940757 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-scripts\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.940790 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-combined-ca-bundle\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.942348 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-fernet-keys\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.947909 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-credential-keys\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.962515 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svnsd\" (UniqueName: \"kubernetes.io/projected/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-kube-api-access-svnsd\") pod \"dnsmasq-dns-bbf5cc879-8z89j\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.965898 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6tcg\" (UniqueName: \"kubernetes.io/projected/a06236b1-dc62-4def-877d-65184413973e-kube-api-access-q6tcg\") pod \"keystone-bootstrap-9thl9\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.973134 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.973587 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.980484 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-brpzr"] Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.988272 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.990258 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4m94q" Feb 24 03:12:54 crc kubenswrapper[4923]: I0224 03:12:54.990540 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:54.993724 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.002111 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8z89j"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.002723 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026751 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-config\") pod \"neutron-db-sync-brpzr\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026791 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-run-httpd\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026810 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwmk\" (UniqueName: \"kubernetes.io/projected/1710156b-5155-4340-8013-2f9e3d68be35-kube-api-access-tdwmk\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026832 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-db-sync-config-data\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026855 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-config-data\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 
03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026870 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js5tl\" (UniqueName: \"kubernetes.io/projected/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-kube-api-access-js5tl\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026886 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-combined-ca-bundle\") pod \"neutron-db-sync-brpzr\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026905 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-scripts\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026920 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-config-data\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026953 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026969 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bea6c18a-0093-4b3b-b56b-323a86181da5-horizon-secret-key\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.026986 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnjnl\" (UniqueName: \"kubernetes.io/projected/bea6c18a-0093-4b3b-b56b-323a86181da5-kube-api-access-xnjnl\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.027006 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-etc-machine-id\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.027020 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-scripts\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.027037 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnxdj\" (UniqueName: \"kubernetes.io/projected/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-kube-api-access-jnxdj\") pod \"neutron-db-sync-brpzr\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.027052 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-log-httpd\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.027068 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-config-data\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.027096 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-scripts\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.027111 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bea6c18a-0093-4b3b-b56b-323a86181da5-logs\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.027130 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-combined-ca-bundle\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.027150 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.043632 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ds4gb"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.044689 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.047442 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.053531 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.053723 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-x4cdv" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133051 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6zbc\" (UniqueName: \"kubernetes.io/projected/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-kube-api-access-s6zbc\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133124 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-config-data\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133155 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-js5tl\" (UniqueName: \"kubernetes.io/projected/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-kube-api-access-js5tl\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133180 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-combined-ca-bundle\") pod \"neutron-db-sync-brpzr\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133199 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-combined-ca-bundle\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133258 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-scripts\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133278 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-config-data\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133461 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133493 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bea6c18a-0093-4b3b-b56b-323a86181da5-horizon-secret-key\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133526 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnjnl\" (UniqueName: \"kubernetes.io/projected/bea6c18a-0093-4b3b-b56b-323a86181da5-kube-api-access-xnjnl\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133573 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-scripts\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133597 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-etc-machine-id\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133616 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-scripts\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 
03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133646 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnxdj\" (UniqueName: \"kubernetes.io/projected/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-kube-api-access-jnxdj\") pod \"neutron-db-sync-brpzr\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133669 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-log-httpd\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133694 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-config-data\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133764 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-scripts\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133785 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bea6c18a-0093-4b3b-b56b-323a86181da5-logs\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133812 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-combined-ca-bundle\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133834 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-logs\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133864 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133930 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-config\") pod \"neutron-db-sync-brpzr\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133952 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-run-httpd\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.133977 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwmk\" (UniqueName: \"kubernetes.io/projected/1710156b-5155-4340-8013-2f9e3d68be35-kube-api-access-tdwmk\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " 
pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.134001 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-config-data\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.134028 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-db-sync-config-data\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.138422 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-scripts\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.138612 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-config-data\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.139995 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-etc-machine-id\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.142673 4923 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-db-sync-brpzr"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.150915 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-run-httpd\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.151493 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-db-sync-config-data\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.154169 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-log-httpd\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.154541 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bea6c18a-0093-4b3b-b56b-323a86181da5-logs\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.159146 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-combined-ca-bundle\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.160169 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-config\") pod \"neutron-db-sync-brpzr\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.160710 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.161155 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bea6c18a-0093-4b3b-b56b-323a86181da5-horizon-secret-key\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.162962 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.165130 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-scripts\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.167095 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-config-data\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 
03:12:55.168287 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnjnl\" (UniqueName: \"kubernetes.io/projected/bea6c18a-0093-4b3b-b56b-323a86181da5-kube-api-access-xnjnl\") pod \"horizon-bc6fbcd4c-xk5sx\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.171906 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js5tl\" (UniqueName: \"kubernetes.io/projected/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-kube-api-access-js5tl\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.178687 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnxdj\" (UniqueName: \"kubernetes.io/projected/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-kube-api-access-jnxdj\") pod \"neutron-db-sync-brpzr\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.185714 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwmk\" (UniqueName: \"kubernetes.io/projected/1710156b-5155-4340-8013-2f9e3d68be35-kube-api-access-tdwmk\") pod \"ceilometer-0\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.202551 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.203737 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-combined-ca-bundle\") pod \"neutron-db-sync-brpzr\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.204057 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-config-data\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.212059 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-scripts\") pod \"cinder-db-sync-gdl2d\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.228730 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ds4gb"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.235980 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-combined-ca-bundle\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.236065 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-scripts\") pod \"placement-db-sync-ds4gb\" (UID: 
\"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.236116 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-logs\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.236162 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-config-data\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.236188 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6zbc\" (UniqueName: \"kubernetes.io/projected/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-kube-api-access-s6zbc\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.237673 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-logs\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.240463 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-config-data\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.250752 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-scripts\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.251226 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-combined-ca-bundle\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.340896 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6zbc\" (UniqueName: \"kubernetes.io/projected/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-kube-api-access-s6zbc\") pod \"placement-db-sync-ds4gb\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.341335 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.356290 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-brpzr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.402402 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s64ld"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.404090 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.417647 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ds4gb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.419023 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rvlxb"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.420146 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.434395 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75cfdb654f-g2wdh"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.435637 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hj69t" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.435826 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.441471 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s64ld"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.443208 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444392 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-db-sync-config-data\") pod \"barbican-db-sync-rvlxb\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444416 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: 
\"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444450 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-config\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444469 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/059fc35a-7b75-46b7-86f3-7b05fb19c5de-horizon-secret-key\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444486 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444501 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444538 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-config-data\") pod 
\"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444577 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvglw\" (UniqueName: \"kubernetes.io/projected/059fc35a-7b75-46b7-86f3-7b05fb19c5de-kube-api-access-xvglw\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444601 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2q8s\" (UniqueName: \"kubernetes.io/projected/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-kube-api-access-f2q8s\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444618 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9skx\" (UniqueName: \"kubernetes.io/projected/29f64c48-5ed4-431c-8636-702a8abf02b5-kube-api-access-f9skx\") pod \"barbican-db-sync-rvlxb\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444640 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-combined-ca-bundle\") pod \"barbican-db-sync-rvlxb\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444660 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-scripts\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444679 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/059fc35a-7b75-46b7-86f3-7b05fb19c5de-logs\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.444693 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.453145 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.455579 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rvlxb"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.484314 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75cfdb654f-g2wdh"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.499667 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.500988 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.504531 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.505621 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.505889 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-p2qxr" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.506023 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.510351 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.516908 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.518869 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.522538 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.522922 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.537659 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547349 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-config\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547385 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/059fc35a-7b75-46b7-86f3-7b05fb19c5de-horizon-secret-key\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547402 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547421 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-swift-storage-0\") pod 
\"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547460 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-config-data\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547497 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvglw\" (UniqueName: \"kubernetes.io/projected/059fc35a-7b75-46b7-86f3-7b05fb19c5de-kube-api-access-xvglw\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547514 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2q8s\" (UniqueName: \"kubernetes.io/projected/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-kube-api-access-f2q8s\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547536 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9skx\" (UniqueName: \"kubernetes.io/projected/29f64c48-5ed4-431c-8636-702a8abf02b5-kube-api-access-f9skx\") pod \"barbican-db-sync-rvlxb\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547555 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-combined-ca-bundle\") pod \"barbican-db-sync-rvlxb\" (UID: 
\"29f64c48-5ed4-431c-8636-702a8abf02b5\") " pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547579 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-scripts\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547600 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/059fc35a-7b75-46b7-86f3-7b05fb19c5de-logs\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547616 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547651 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-db-sync-config-data\") pod \"barbican-db-sync-rvlxb\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.547668 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc 
kubenswrapper[4923]: I0224 03:12:55.549451 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-config\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.550908 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.551803 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-scripts\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.551874 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-config-data\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.552469 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.552739 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/059fc35a-7b75-46b7-86f3-7b05fb19c5de-logs\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.553022 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.555677 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.555958 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/059fc35a-7b75-46b7-86f3-7b05fb19c5de-horizon-secret-key\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.578389 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-db-sync-config-data\") pod \"barbican-db-sync-rvlxb\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.578703 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9skx\" (UniqueName: \"kubernetes.io/projected/29f64c48-5ed4-431c-8636-702a8abf02b5-kube-api-access-f9skx\") pod 
\"barbican-db-sync-rvlxb\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.579424 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-combined-ca-bundle\") pod \"barbican-db-sync-rvlxb\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.580127 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvglw\" (UniqueName: \"kubernetes.io/projected/059fc35a-7b75-46b7-86f3-7b05fb19c5de-kube-api-access-xvglw\") pod \"horizon-75cfdb654f-g2wdh\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.581665 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2q8s\" (UniqueName: \"kubernetes.io/projected/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-kube-api-access-f2q8s\") pod \"dnsmasq-dns-56df8fb6b7-s64ld\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.601947 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.627531 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651067 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-scripts\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651102 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651128 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrnx8\" (UniqueName: \"kubernetes.io/projected/4b0f2318-6095-4c1c-8f68-3e897f966ca2-kube-api-access-qrnx8\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651151 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651171 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651187 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651210 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-logs\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651228 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651244 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-config-data\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651286 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651327 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651345 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvjkc\" (UniqueName: \"kubernetes.io/projected/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-kube-api-access-jvjkc\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651370 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651410 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651456 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-logs\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.651471 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.752845 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-logs\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.752890 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.752951 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-scripts\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.752984 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753014 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrnx8\" (UniqueName: \"kubernetes.io/projected/4b0f2318-6095-4c1c-8f68-3e897f966ca2-kube-api-access-qrnx8\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753044 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753080 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753102 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753157 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753194 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753217 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-config-data\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753259 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753485 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753515 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvjkc\" (UniqueName: \"kubernetes.io/projected/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-kube-api-access-jvjkc\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " 
pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753562 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.753604 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.754546 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-logs\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.755121 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-logs\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.755144 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 
03:12:55.756620 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.756944 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.761094 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.768038 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.768901 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.770261 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-config-data\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.771927 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.772796 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-scripts\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.778011 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.783411 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.790004 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.790126 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.790418 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvjkc\" (UniqueName: \"kubernetes.io/projected/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-kube-api-access-jvjkc\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.790612 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrnx8\" (UniqueName: \"kubernetes.io/projected/4b0f2318-6095-4c1c-8f68-3e897f966ca2-kube-api-access-qrnx8\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.841035 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.877252 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.943727 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.952969 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 24 03:12:55 crc kubenswrapper[4923]: I0224 03:12:55.960024 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9thl9"] Feb 24 03:12:55 crc kubenswrapper[4923]: W0224 03:12:55.965835 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda06236b1_dc62_4def_877d_65184413973e.slice/crio-a8db3ab4bafe04a70349384b887a95f78477a3158d01b6a459147eb8050d2314 WatchSource:0}: Error finding container a8db3ab4bafe04a70349384b887a95f78477a3158d01b6a459147eb8050d2314: Status 404 returned error can't find the container with id a8db3ab4bafe04a70349384b887a95f78477a3158d01b6a459147eb8050d2314 Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.185853 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.199310 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8z89j"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.265058 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rst7l\" (UniqueName: \"kubernetes.io/projected/def9fcfc-1366-4652-a4c7-aeec946c3d96-kube-api-access-rst7l\") pod \"def9fcfc-1366-4652-a4c7-aeec946c3d96\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.265114 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-sb\") pod \"def9fcfc-1366-4652-a4c7-aeec946c3d96\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.265198 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-config\") pod \"def9fcfc-1366-4652-a4c7-aeec946c3d96\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.265312 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-nb\") pod \"def9fcfc-1366-4652-a4c7-aeec946c3d96\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.265363 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-dns-svc\") pod \"def9fcfc-1366-4652-a4c7-aeec946c3d96\" (UID: \"def9fcfc-1366-4652-a4c7-aeec946c3d96\") " Feb 24 
03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.272499 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def9fcfc-1366-4652-a4c7-aeec946c3d96-kube-api-access-rst7l" (OuterVolumeSpecName: "kube-api-access-rst7l") pod "def9fcfc-1366-4652-a4c7-aeec946c3d96" (UID: "def9fcfc-1366-4652-a4c7-aeec946c3d96"). InnerVolumeSpecName "kube-api-access-rst7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.312713 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-config" (OuterVolumeSpecName: "config") pod "def9fcfc-1366-4652-a4c7-aeec946c3d96" (UID: "def9fcfc-1366-4652-a4c7-aeec946c3d96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.315222 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "def9fcfc-1366-4652-a4c7-aeec946c3d96" (UID: "def9fcfc-1366-4652-a4c7-aeec946c3d96"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.349124 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "def9fcfc-1366-4652-a4c7-aeec946c3d96" (UID: "def9fcfc-1366-4652-a4c7-aeec946c3d96"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.350105 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "def9fcfc-1366-4652-a4c7-aeec946c3d96" (UID: "def9fcfc-1366-4652-a4c7-aeec946c3d96"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.368635 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rst7l\" (UniqueName: \"kubernetes.io/projected/def9fcfc-1366-4652-a4c7-aeec946c3d96-kube-api-access-rst7l\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.368666 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.368676 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.368700 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.368709 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/def9fcfc-1366-4652-a4c7-aeec946c3d96-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.428885 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" 
event={"ID":"def9fcfc-1366-4652-a4c7-aeec946c3d96","Type":"ContainerDied","Data":"8ac23df67b5af937f79a19b81c22be0007ac41c1ae5b584b5f3929611f7509bc"} Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.428943 4923 scope.go:117] "RemoveContainer" containerID="feb6787c7e98f2fbe412f3b5398b72c45d5eb2cc9c230d730805b44e2cd1686f" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.430863 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" event={"ID":"19a85a5c-3407-4f5c-bbe7-72550f09bdf0","Type":"ContainerStarted","Data":"2741b3864c4b4a9489fee381c277a9981b11cda68b8ea1ee7909a7501e4ce845"} Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.431902 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-xdptp" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.432872 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9thl9" event={"ID":"a06236b1-dc62-4def-877d-65184413973e","Type":"ContainerStarted","Data":"5a85c6672f1a475a0cbaf700fa58bad085c139f5bb8fd40c166325f0e40fa6b0"} Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.432918 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9thl9" event={"ID":"a06236b1-dc62-4def-877d-65184413973e","Type":"ContainerStarted","Data":"a8db3ab4bafe04a70349384b887a95f78477a3158d01b6a459147eb8050d2314"} Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.449567 4923 scope.go:117] "RemoveContainer" containerID="90aa696378c45d2fa75d78100316e113dffe4e97ce18a3c0a6c3593deb8ea805" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.462606 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9thl9" podStartSLOduration=2.46258275 podStartE2EDuration="2.46258275s" podCreationTimestamp="2026-02-24 03:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:56.460528357 +0000 UTC m=+1100.477599180" watchObservedRunningTime="2026-02-24 03:12:56.46258275 +0000 UTC m=+1100.479653563" Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.500798 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xdptp"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.525005 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xdptp"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.542429 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-brpzr"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.553411 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bc6fbcd4c-xk5sx"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.573821 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.585953 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ds4gb"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.591400 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gdl2d"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.597752 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75cfdb654f-g2wdh"] Feb 24 03:12:56 crc kubenswrapper[4923]: W0224 03:12:56.632040 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ad7cefc_c3bb_48ff_ab05_0fe707823e84.slice/crio-047b94f6e9abd8bf2480128f928e3277ef8ecd15674f7d7aa9951c84eeb87bc6 WatchSource:0}: Error finding container 047b94f6e9abd8bf2480128f928e3277ef8ecd15674f7d7aa9951c84eeb87bc6: Status 404 returned error can't find the container with id 
047b94f6e9abd8bf2480128f928e3277ef8ecd15674f7d7aa9951c84eeb87bc6 Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.665282 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s64ld"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.668148 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rvlxb"] Feb 24 03:12:56 crc kubenswrapper[4923]: W0224 03:12:56.680239 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33493b8c_7d7c_4ad5_9b81_12da1ae17aee.slice/crio-f426205667284284d04d2bc425198a06bdefb22ed3c38920065e340bea360cda WatchSource:0}: Error finding container f426205667284284d04d2bc425198a06bdefb22ed3c38920065e340bea360cda: Status 404 returned error can't find the container with id f426205667284284d04d2bc425198a06bdefb22ed3c38920065e340bea360cda Feb 24 03:12:56 crc kubenswrapper[4923]: W0224 03:12:56.866703 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod218da1a6_d1a3_4c71_8fe5_785e6d505c0a.slice/crio-f2095ccf935f47bd865d3aa8dece5be6fe2d39cf6b5e46ed0b9f0fcf34968384 WatchSource:0}: Error finding container f2095ccf935f47bd865d3aa8dece5be6fe2d39cf6b5e46ed0b9f0fcf34968384: Status 404 returned error can't find the container with id f2095ccf935f47bd865d3aa8dece5be6fe2d39cf6b5e46ed0b9f0fcf34968384 Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.867716 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:12:56 crc kubenswrapper[4923]: I0224 03:12:56.962633 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.030966 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bc6fbcd4c-xk5sx"] Feb 24 03:12:57 crc kubenswrapper[4923]: 
I0224 03:12:57.066859 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-575b899b65-wzw2v"] Feb 24 03:12:57 crc kubenswrapper[4923]: E0224 03:12:57.067263 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def9fcfc-1366-4652-a4c7-aeec946c3d96" containerName="dnsmasq-dns" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.067275 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="def9fcfc-1366-4652-a4c7-aeec946c3d96" containerName="dnsmasq-dns" Feb 24 03:12:57 crc kubenswrapper[4923]: E0224 03:12:57.067312 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def9fcfc-1366-4652-a4c7-aeec946c3d96" containerName="init" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.067318 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="def9fcfc-1366-4652-a4c7-aeec946c3d96" containerName="init" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.067498 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="def9fcfc-1366-4652-a4c7-aeec946c3d96" containerName="dnsmasq-dns" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.068449 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.075344 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.099480 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.118070 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-575b899b65-wzw2v"] Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.176925 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.191410 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-horizon-secret-key\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.191489 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-scripts\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.191657 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9c49\" (UniqueName: \"kubernetes.io/projected/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-kube-api-access-z9c49\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.191892 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-config-data\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.191951 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-logs\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: W0224 03:12:57.224260 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b0f2318_6095_4c1c_8f68_3e897f966ca2.slice/crio-8a0c3be26b5c7bbe2013bc6b7ff044b5074acb7da3f65736104cbfbf5730ba96 WatchSource:0}: Error finding container 8a0c3be26b5c7bbe2013bc6b7ff044b5074acb7da3f65736104cbfbf5730ba96: Status 404 returned error can't find the container with id 8a0c3be26b5c7bbe2013bc6b7ff044b5074acb7da3f65736104cbfbf5730ba96 Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.301170 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9c49\" (UniqueName: \"kubernetes.io/projected/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-kube-api-access-z9c49\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.301284 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-config-data\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " 
pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.301389 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-logs\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.301429 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-horizon-secret-key\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.301461 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-scripts\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.302604 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-scripts\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.302980 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-config-data\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.302984 4923 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-logs\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.307515 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-horizon-secret-key\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.319336 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9c49\" (UniqueName: \"kubernetes.io/projected/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-kube-api-access-z9c49\") pod \"horizon-575b899b65-wzw2v\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.416098 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.450905 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1710156b-5155-4340-8013-2f9e3d68be35","Type":"ContainerStarted","Data":"69f4fb557fceb5fbc8b66c75a10cb44a33f78eac04c65a379bfd42adb35d6121"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.453403 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc6fbcd4c-xk5sx" event={"ID":"bea6c18a-0093-4b3b-b56b-323a86181da5","Type":"ContainerStarted","Data":"adbc3c3b27b2af992f4955fc8858e13898ae934fc89ce290fa1cc73c9efa32fe"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.456348 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ds4gb" event={"ID":"d0d1f021-7b1a-491b-9dd5-90d6425bcde7","Type":"ContainerStarted","Data":"f5b028dd0787b75b9d00df4f6c5fccbbab71674a30d7a99ad29ad0b4dea850bf"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.458223 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75cfdb654f-g2wdh" event={"ID":"059fc35a-7b75-46b7-86f3-7b05fb19c5de","Type":"ContainerStarted","Data":"6d078c76b86ddbfb1d26b5c70bb353cb25212682c9c4491cad1f547a24ef9f67"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.460808 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rvlxb" event={"ID":"29f64c48-5ed4-431c-8636-702a8abf02b5","Type":"ContainerStarted","Data":"c601da1bc4bf317ab6af2fd9015d98846f670fd2e4a1070eae2ef6734e84ab96"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.464042 4923 generic.go:334] "Generic (PLEG): container finished" podID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" containerID="df75237edd7c81620d9c0ebdc1abe57c3ca54b8c99ac10a55fb50a3bf608f301" exitCode=0 Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.464081 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" event={"ID":"33493b8c-7d7c-4ad5-9b81-12da1ae17aee","Type":"ContainerDied","Data":"df75237edd7c81620d9c0ebdc1abe57c3ca54b8c99ac10a55fb50a3bf608f301"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.464096 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" event={"ID":"33493b8c-7d7c-4ad5-9b81-12da1ae17aee","Type":"ContainerStarted","Data":"f426205667284284d04d2bc425198a06bdefb22ed3c38920065e340bea360cda"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.468892 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gdl2d" event={"ID":"8ad7cefc-c3bb-48ff-ab05-0fe707823e84","Type":"ContainerStarted","Data":"047b94f6e9abd8bf2480128f928e3277ef8ecd15674f7d7aa9951c84eeb87bc6"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.476023 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"218da1a6-d1a3-4c71-8fe5-785e6d505c0a","Type":"ContainerStarted","Data":"f2095ccf935f47bd865d3aa8dece5be6fe2d39cf6b5e46ed0b9f0fcf34968384"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.481115 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b0f2318-6095-4c1c-8f68-3e897f966ca2","Type":"ContainerStarted","Data":"8a0c3be26b5c7bbe2013bc6b7ff044b5074acb7da3f65736104cbfbf5730ba96"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.493217 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-brpzr" event={"ID":"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd","Type":"ContainerStarted","Data":"02795f14259e3be3db9608da5152c23a7606f40060e396cfffc3b567c9ffc3ec"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.493258 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-brpzr" 
event={"ID":"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd","Type":"ContainerStarted","Data":"c2fc2120b22c66907835119e8581c2231b1f52a807d40c234e77798389224409"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.498401 4923 generic.go:334] "Generic (PLEG): container finished" podID="19a85a5c-3407-4f5c-bbe7-72550f09bdf0" containerID="92eaf11b2a809c8be759bb0aa6eb4db2f580de52c9ab1ed539fe6859f5d298f5" exitCode=0 Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.498502 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" event={"ID":"19a85a5c-3407-4f5c-bbe7-72550f09bdf0","Type":"ContainerDied","Data":"92eaf11b2a809c8be759bb0aa6eb4db2f580de52c9ab1ed539fe6859f5d298f5"} Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.507874 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-brpzr" podStartSLOduration=3.507860438 podStartE2EDuration="3.507860438s" podCreationTimestamp="2026-02-24 03:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:57.505566228 +0000 UTC m=+1101.522637041" watchObservedRunningTime="2026-02-24 03:12:57.507860438 +0000 UTC m=+1101.524931251" Feb 24 03:12:57 crc kubenswrapper[4923]: I0224 03:12:57.743419 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def9fcfc-1366-4652-a4c7-aeec946c3d96" path="/var/lib/kubelet/pods/def9fcfc-1366-4652-a4c7-aeec946c3d96/volumes" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.075775 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.132517 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-575b899b65-wzw2v"] Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.230821 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-config\") pod \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.230876 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-sb\") pod \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.230936 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-nb\") pod \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.230998 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-swift-storage-0\") pod \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.231115 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-svc\") pod \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " Feb 24 03:12:58 crc 
kubenswrapper[4923]: I0224 03:12:58.231201 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svnsd\" (UniqueName: \"kubernetes.io/projected/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-kube-api-access-svnsd\") pod \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\" (UID: \"19a85a5c-3407-4f5c-bbe7-72550f09bdf0\") " Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.245484 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-kube-api-access-svnsd" (OuterVolumeSpecName: "kube-api-access-svnsd") pod "19a85a5c-3407-4f5c-bbe7-72550f09bdf0" (UID: "19a85a5c-3407-4f5c-bbe7-72550f09bdf0"). InnerVolumeSpecName "kube-api-access-svnsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.261068 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "19a85a5c-3407-4f5c-bbe7-72550f09bdf0" (UID: "19a85a5c-3407-4f5c-bbe7-72550f09bdf0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.272842 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19a85a5c-3407-4f5c-bbe7-72550f09bdf0" (UID: "19a85a5c-3407-4f5c-bbe7-72550f09bdf0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.281068 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19a85a5c-3407-4f5c-bbe7-72550f09bdf0" (UID: "19a85a5c-3407-4f5c-bbe7-72550f09bdf0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.281362 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19a85a5c-3407-4f5c-bbe7-72550f09bdf0" (UID: "19a85a5c-3407-4f5c-bbe7-72550f09bdf0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.294429 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-config" (OuterVolumeSpecName: "config") pod "19a85a5c-3407-4f5c-bbe7-72550f09bdf0" (UID: "19a85a5c-3407-4f5c-bbe7-72550f09bdf0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.332910 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.332953 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svnsd\" (UniqueName: \"kubernetes.io/projected/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-kube-api-access-svnsd\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.332972 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.332986 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.332998 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.333011 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19a85a5c-3407-4f5c-bbe7-72550f09bdf0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.524821 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" event={"ID":"33493b8c-7d7c-4ad5-9b81-12da1ae17aee","Type":"ContainerStarted","Data":"22d281b97ce0278ea23d502288dd12ed402dd1839acaed7bfa6bbfc2874f3546"} Feb 24 03:12:58 crc 
kubenswrapper[4923]: I0224 03:12:58.526041 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.530889 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b899b65-wzw2v" event={"ID":"f3e8092d-b9af-4e2f-a5f1-0682e2eff867","Type":"ContainerStarted","Data":"e003e85df27da7d276da5759ffbd9359a348f7ff905dd51ceaedcdf6cf8fd192"} Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.533032 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" event={"ID":"19a85a5c-3407-4f5c-bbe7-72550f09bdf0","Type":"ContainerDied","Data":"2741b3864c4b4a9489fee381c277a9981b11cda68b8ea1ee7909a7501e4ce845"} Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.533065 4923 scope.go:117] "RemoveContainer" containerID="92eaf11b2a809c8be759bb0aa6eb4db2f580de52c9ab1ed539fe6859f5d298f5" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.533174 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-8z89j" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.543476 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"218da1a6-d1a3-4c71-8fe5-785e6d505c0a","Type":"ContainerStarted","Data":"f56e856f839ebd36b3dc62c68002d7d85f952bfa98eff170165de03d9c8a342c"} Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.553683 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" podStartSLOduration=3.553665489 podStartE2EDuration="3.553665489s" podCreationTimestamp="2026-02-24 03:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:58.549662374 +0000 UTC m=+1102.566733187" watchObservedRunningTime="2026-02-24 03:12:58.553665489 +0000 UTC m=+1102.570736292" Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.700725 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8z89j"] Feb 24 03:12:58 crc kubenswrapper[4923]: I0224 03:12:58.723005 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8z89j"] Feb 24 03:12:59 crc kubenswrapper[4923]: I0224 03:12:59.628503 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"218da1a6-d1a3-4c71-8fe5-785e6d505c0a","Type":"ContainerStarted","Data":"e1c6d1c25b04393f5d91675018ce2fc029cbadbd5abee9a2237e37e90d784a7c"} Feb 24 03:12:59 crc kubenswrapper[4923]: I0224 03:12:59.629006 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerName="glance-log" containerID="cri-o://f56e856f839ebd36b3dc62c68002d7d85f952bfa98eff170165de03d9c8a342c" gracePeriod=30 Feb 24 03:12:59 crc kubenswrapper[4923]: 
I0224 03:12:59.629912 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerName="glance-httpd" containerID="cri-o://e1c6d1c25b04393f5d91675018ce2fc029cbadbd5abee9a2237e37e90d784a7c" gracePeriod=30 Feb 24 03:12:59 crc kubenswrapper[4923]: I0224 03:12:59.648879 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b0f2318-6095-4c1c-8f68-3e897f966ca2","Type":"ContainerStarted","Data":"6a91a07ccc337d5c77ccf539943b73b40e13e07c932e9de47a7a76ba7c316a93"} Feb 24 03:12:59 crc kubenswrapper[4923]: I0224 03:12:59.674269 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.674243041 podStartE2EDuration="4.674243041s" podCreationTimestamp="2026-02-24 03:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:12:59.664320542 +0000 UTC m=+1103.681391355" watchObservedRunningTime="2026-02-24 03:12:59.674243041 +0000 UTC m=+1103.691313854" Feb 24 03:12:59 crc kubenswrapper[4923]: I0224 03:12:59.728795 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19a85a5c-3407-4f5c-bbe7-72550f09bdf0" path="/var/lib/kubelet/pods/19a85a5c-3407-4f5c-bbe7-72550f09bdf0/volumes" Feb 24 03:13:00 crc kubenswrapper[4923]: I0224 03:13:00.659426 4923 generic.go:334] "Generic (PLEG): container finished" podID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerID="e1c6d1c25b04393f5d91675018ce2fc029cbadbd5abee9a2237e37e90d784a7c" exitCode=0 Feb 24 03:13:00 crc kubenswrapper[4923]: I0224 03:13:00.659785 4923 generic.go:334] "Generic (PLEG): container finished" podID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerID="f56e856f839ebd36b3dc62c68002d7d85f952bfa98eff170165de03d9c8a342c" exitCode=143 Feb 24 03:13:00 crc 
kubenswrapper[4923]: I0224 03:13:00.659513 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"218da1a6-d1a3-4c71-8fe5-785e6d505c0a","Type":"ContainerDied","Data":"e1c6d1c25b04393f5d91675018ce2fc029cbadbd5abee9a2237e37e90d784a7c"} Feb 24 03:13:00 crc kubenswrapper[4923]: I0224 03:13:00.659857 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"218da1a6-d1a3-4c71-8fe5-785e6d505c0a","Type":"ContainerDied","Data":"f56e856f839ebd36b3dc62c68002d7d85f952bfa98eff170165de03d9c8a342c"} Feb 24 03:13:00 crc kubenswrapper[4923]: I0224 03:13:00.663275 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b0f2318-6095-4c1c-8f68-3e897f966ca2","Type":"ContainerStarted","Data":"dabc68181fe71c00b9a6391131245ef1fc2d6e599fd51c465fb265fa9ee34c99"} Feb 24 03:13:00 crc kubenswrapper[4923]: I0224 03:13:00.663554 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerName="glance-httpd" containerID="cri-o://dabc68181fe71c00b9a6391131245ef1fc2d6e599fd51c465fb265fa9ee34c99" gracePeriod=30 Feb 24 03:13:00 crc kubenswrapper[4923]: I0224 03:13:00.663560 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerName="glance-log" containerID="cri-o://6a91a07ccc337d5c77ccf539943b73b40e13e07c932e9de47a7a76ba7c316a93" gracePeriod=30 Feb 24 03:13:00 crc kubenswrapper[4923]: I0224 03:13:00.690205 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.690187023 podStartE2EDuration="5.690187023s" podCreationTimestamp="2026-02-24 03:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:00.685000657 +0000 UTC m=+1104.702071480" watchObservedRunningTime="2026-02-24 03:13:00.690187023 +0000 UTC m=+1104.707257846" Feb 24 03:13:01 crc kubenswrapper[4923]: I0224 03:13:01.693508 4923 generic.go:334] "Generic (PLEG): container finished" podID="a06236b1-dc62-4def-877d-65184413973e" containerID="5a85c6672f1a475a0cbaf700fa58bad085c139f5bb8fd40c166325f0e40fa6b0" exitCode=0 Feb 24 03:13:01 crc kubenswrapper[4923]: I0224 03:13:01.693650 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9thl9" event={"ID":"a06236b1-dc62-4def-877d-65184413973e","Type":"ContainerDied","Data":"5a85c6672f1a475a0cbaf700fa58bad085c139f5bb8fd40c166325f0e40fa6b0"} Feb 24 03:13:01 crc kubenswrapper[4923]: I0224 03:13:01.698588 4923 generic.go:334] "Generic (PLEG): container finished" podID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerID="dabc68181fe71c00b9a6391131245ef1fc2d6e599fd51c465fb265fa9ee34c99" exitCode=0 Feb 24 03:13:01 crc kubenswrapper[4923]: I0224 03:13:01.698612 4923 generic.go:334] "Generic (PLEG): container finished" podID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerID="6a91a07ccc337d5c77ccf539943b73b40e13e07c932e9de47a7a76ba7c316a93" exitCode=143 Feb 24 03:13:01 crc kubenswrapper[4923]: I0224 03:13:01.698631 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b0f2318-6095-4c1c-8f68-3e897f966ca2","Type":"ContainerDied","Data":"dabc68181fe71c00b9a6391131245ef1fc2d6e599fd51c465fb265fa9ee34c99"} Feb 24 03:13:01 crc kubenswrapper[4923]: I0224 03:13:01.698650 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b0f2318-6095-4c1c-8f68-3e897f966ca2","Type":"ContainerDied","Data":"6a91a07ccc337d5c77ccf539943b73b40e13e07c932e9de47a7a76ba7c316a93"} Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.402488 4923 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75cfdb654f-g2wdh"] Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.431738 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7fb8677dd-w8wrp"] Feb 24 03:13:03 crc kubenswrapper[4923]: E0224 03:13:03.432345 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19a85a5c-3407-4f5c-bbe7-72550f09bdf0" containerName="init" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.432437 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="19a85a5c-3407-4f5c-bbe7-72550f09bdf0" containerName="init" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.432638 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="19a85a5c-3407-4f5c-bbe7-72550f09bdf0" containerName="init" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.433768 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.443793 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.466728 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb8677dd-w8wrp"] Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.486660 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-575b899b65-wzw2v"] Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.507752 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6dcbd8cd94-497ns"] Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.509322 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.527766 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dcbd8cd94-497ns"] Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.607389 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-scripts\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.607472 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6whmk\" (UniqueName: \"kubernetes.io/projected/260b26fd-552c-4dbb-b181-d423dbd57de2-kube-api-access-6whmk\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.607495 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-secret-key\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.607526 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260b26fd-552c-4dbb-b181-d423dbd57de2-logs\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.607542 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-combined-ca-bundle\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.607584 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-config-data\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.607630 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-tls-certs\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.708930 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-scripts\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.709935 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-scripts\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.710010 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cad919b-bb41-4c17-a13a-01831e715fd9-logs\") pod 
\"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.710138 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctvpb\" (UniqueName: \"kubernetes.io/projected/3cad919b-bb41-4c17-a13a-01831e715fd9-kube-api-access-ctvpb\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.710207 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6whmk\" (UniqueName: \"kubernetes.io/projected/260b26fd-552c-4dbb-b181-d423dbd57de2-kube-api-access-6whmk\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.710236 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-secret-key\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.710275 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cad919b-bb41-4c17-a13a-01831e715fd9-scripts\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.711728 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260b26fd-552c-4dbb-b181-d423dbd57de2-logs\") pod \"horizon-7fb8677dd-w8wrp\" (UID: 
\"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.711754 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-combined-ca-bundle\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.711785 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3cad919b-bb41-4c17-a13a-01831e715fd9-config-data\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.711803 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cad919b-bb41-4c17-a13a-01831e715fd9-combined-ca-bundle\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.711834 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-config-data\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.711887 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3cad919b-bb41-4c17-a13a-01831e715fd9-horizon-secret-key\") pod \"horizon-6dcbd8cd94-497ns\" (UID: 
\"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.712145 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260b26fd-552c-4dbb-b181-d423dbd57de2-logs\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.712975 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad919b-bb41-4c17-a13a-01831e715fd9-horizon-tls-certs\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.713010 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-config-data\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.713389 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-tls-certs\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.717929 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-secret-key\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc 
kubenswrapper[4923]: I0224 03:13:03.718149 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-combined-ca-bundle\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.722760 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-tls-certs\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.724972 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6whmk\" (UniqueName: \"kubernetes.io/projected/260b26fd-552c-4dbb-b181-d423dbd57de2-kube-api-access-6whmk\") pod \"horizon-7fb8677dd-w8wrp\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.767050 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.815436 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3cad919b-bb41-4c17-a13a-01831e715fd9-config-data\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.815478 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cad919b-bb41-4c17-a13a-01831e715fd9-combined-ca-bundle\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.815511 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3cad919b-bb41-4c17-a13a-01831e715fd9-horizon-secret-key\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.815557 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad919b-bb41-4c17-a13a-01831e715fd9-horizon-tls-certs\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.815638 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cad919b-bb41-4c17-a13a-01831e715fd9-logs\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: 
I0224 03:13:03.815661 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctvpb\" (UniqueName: \"kubernetes.io/projected/3cad919b-bb41-4c17-a13a-01831e715fd9-kube-api-access-ctvpb\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.815712 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cad919b-bb41-4c17-a13a-01831e715fd9-scripts\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.816449 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cad919b-bb41-4c17-a13a-01831e715fd9-scripts\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.817528 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3cad919b-bb41-4c17-a13a-01831e715fd9-config-data\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.819367 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cad919b-bb41-4c17-a13a-01831e715fd9-logs\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.821846 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/3cad919b-bb41-4c17-a13a-01831e715fd9-horizon-secret-key\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.825234 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cad919b-bb41-4c17-a13a-01831e715fd9-combined-ca-bundle\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.825790 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad919b-bb41-4c17-a13a-01831e715fd9-horizon-tls-certs\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.833975 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctvpb\" (UniqueName: \"kubernetes.io/projected/3cad919b-bb41-4c17-a13a-01831e715fd9-kube-api-access-ctvpb\") pod \"horizon-6dcbd8cd94-497ns\" (UID: \"3cad919b-bb41-4c17-a13a-01831e715fd9\") " pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:03 crc kubenswrapper[4923]: I0224 03:13:03.834418 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:05 crc kubenswrapper[4923]: I0224 03:13:05.774452 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:13:05 crc kubenswrapper[4923]: I0224 03:13:05.865391 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-frjd7"] Feb 24 03:13:05 crc kubenswrapper[4923]: I0224 03:13:05.865668 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerName="dnsmasq-dns" containerID="cri-o://0f6191a8718c2bb6f45afbfa65d8a1e10252cc87287859ec22708178def8f5a9" gracePeriod=10 Feb 24 03:13:06 crc kubenswrapper[4923]: I0224 03:13:06.781593 4923 generic.go:334] "Generic (PLEG): container finished" podID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerID="0f6191a8718c2bb6f45afbfa65d8a1e10252cc87287859ec22708178def8f5a9" exitCode=0 Feb 24 03:13:06 crc kubenswrapper[4923]: I0224 03:13:06.781638 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" event={"ID":"9c060758-83bf-4a7c-90b4-13c2a20194b3","Type":"ContainerDied","Data":"0f6191a8718c2bb6f45afbfa65d8a1e10252cc87287859ec22708178def8f5a9"} Feb 24 03:13:07 crc kubenswrapper[4923]: I0224 03:13:07.924717 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Feb 24 03:13:09 crc kubenswrapper[4923]: E0224 03:13:09.488593 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 24 03:13:09 crc kubenswrapper[4923]: E0224 
03:13:09.489186 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7fh598h5dfh5ffh5cfh559hd7h557h647h657h56bh87hb7h5fdh75h5f6h666h56dh677h5dfh5dbh66ch694h5ffhd4h5h6h695h7bh584h657h56dq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdwmk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1710156b-5155-4340-8013-2f9e3d68be35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.270471 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.354430 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.355519 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-internal-tls-certs\") pod \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.355614 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-logs\") pod \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.355671 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvjkc\" (UniqueName: \"kubernetes.io/projected/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-kube-api-access-jvjkc\") pod \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.355927 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-logs" (OuterVolumeSpecName: "logs") pod "218da1a6-d1a3-4c71-8fe5-785e6d505c0a" (UID: "218da1a6-d1a3-4c71-8fe5-785e6d505c0a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.356328 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-httpd-run\") pod \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.356356 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-scripts\") pod \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.356379 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-config-data\") pod \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.356457 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-combined-ca-bundle\") pod \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\" (UID: \"218da1a6-d1a3-4c71-8fe5-785e6d505c0a\") " Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.356578 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "218da1a6-d1a3-4c71-8fe5-785e6d505c0a" (UID: "218da1a6-d1a3-4c71-8fe5-785e6d505c0a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.357154 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.357182 4923 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.362578 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-scripts" (OuterVolumeSpecName: "scripts") pod "218da1a6-d1a3-4c71-8fe5-785e6d505c0a" (UID: "218da1a6-d1a3-4c71-8fe5-785e6d505c0a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.363004 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-kube-api-access-jvjkc" (OuterVolumeSpecName: "kube-api-access-jvjkc") pod "218da1a6-d1a3-4c71-8fe5-785e6d505c0a" (UID: "218da1a6-d1a3-4c71-8fe5-785e6d505c0a"). InnerVolumeSpecName "kube-api-access-jvjkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.366575 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "218da1a6-d1a3-4c71-8fe5-785e6d505c0a" (UID: "218da1a6-d1a3-4c71-8fe5-785e6d505c0a"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.400557 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "218da1a6-d1a3-4c71-8fe5-785e6d505c0a" (UID: "218da1a6-d1a3-4c71-8fe5-785e6d505c0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.417328 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "218da1a6-d1a3-4c71-8fe5-785e6d505c0a" (UID: "218da1a6-d1a3-4c71-8fe5-785e6d505c0a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.441808 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-config-data" (OuterVolumeSpecName: "config-data") pod "218da1a6-d1a3-4c71-8fe5-785e6d505c0a" (UID: "218da1a6-d1a3-4c71-8fe5-785e6d505c0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.459066 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.459118 4923 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.459128 4923 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.459138 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvjkc\" (UniqueName: \"kubernetes.io/projected/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-kube-api-access-jvjkc\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.459150 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.459158 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/218da1a6-d1a3-4c71-8fe5-785e6d505c0a-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.480767 4923 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.560640 4923 reconciler_common.go:293] "Volume detached for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.820699 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"218da1a6-d1a3-4c71-8fe5-785e6d505c0a","Type":"ContainerDied","Data":"f2095ccf935f47bd865d3aa8dece5be6fe2d39cf6b5e46ed0b9f0fcf34968384"} Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.820780 4923 scope.go:117] "RemoveContainer" containerID="e1c6d1c25b04393f5d91675018ce2fc029cbadbd5abee9a2237e37e90d784a7c" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.820995 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.850113 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.861062 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.879936 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:13:11 crc kubenswrapper[4923]: E0224 03:13:11.880453 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerName="glance-httpd" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.880475 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerName="glance-httpd" Feb 24 03:13:11 crc kubenswrapper[4923]: E0224 03:13:11.880495 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerName="glance-log" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.880504 4923 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerName="glance-log" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.880742 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerName="glance-log" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.880766 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" containerName="glance-httpd" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.881952 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.884558 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.884749 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 24 03:13:11 crc kubenswrapper[4923]: I0224 03:13:11.888228 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.068168 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.068371 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 
24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.068564 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.068638 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxj7g\" (UniqueName: \"kubernetes.io/projected/cb08edd9-6041-489a-8713-8bc00d88527c-kube-api-access-kxj7g\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.068694 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.068721 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.068831 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " 
pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.068974 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170068 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170184 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170214 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170242 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxj7g\" (UniqueName: \"kubernetes.io/projected/cb08edd9-6041-489a-8713-8bc00d88527c-kube-api-access-kxj7g\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 
03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170265 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170288 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170353 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170664 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170851 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170851 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.170946 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-logs\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.176323 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.176505 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.177614 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.177747 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.188279 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxj7g\" (UniqueName: \"kubernetes.io/projected/cb08edd9-6041-489a-8713-8bc00d88527c-kube-api-access-kxj7g\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.198231 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.207264 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:12 crc kubenswrapper[4923]: I0224 03:13:12.925840 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Feb 24 03:13:13 crc kubenswrapper[4923]: I0224 03:13:13.722400 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218da1a6-d1a3-4c71-8fe5-785e6d505c0a" path="/var/lib/kubelet/pods/218da1a6-d1a3-4c71-8fe5-785e6d505c0a/volumes" Feb 24 03:13:16 crc kubenswrapper[4923]: I0224 03:13:16.876551 4923 generic.go:334] "Generic (PLEG): container finished" podID="4673b1fa-d73c-48c9-b2fd-d0f7afe97efd" containerID="02795f14259e3be3db9608da5152c23a7606f40060e396cfffc3b567c9ffc3ec" exitCode=0 Feb 24 03:13:16 crc kubenswrapper[4923]: I0224 03:13:16.876635 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-brpzr" event={"ID":"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd","Type":"ContainerDied","Data":"02795f14259e3be3db9608da5152c23a7606f40060e396cfffc3b567c9ffc3ec"} Feb 24 03:13:17 crc kubenswrapper[4923]: I0224 03:13:17.926120 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Feb 24 03:13:17 crc kubenswrapper[4923]: I0224 03:13:17.926508 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:19.916337 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:19.916679 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.519487 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.642782 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-logs\") pod \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.643264 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-public-tls-certs\") pod \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.643318 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-config-data\") pod \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.643425 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-httpd-run\") pod \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\" (UID: 
\"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.643447 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-scripts\") pod \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.643465 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.643488 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrnx8\" (UniqueName: \"kubernetes.io/projected/4b0f2318-6095-4c1c-8f68-3e897f966ca2-kube-api-access-qrnx8\") pod \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.643552 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-combined-ca-bundle\") pod \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\" (UID: \"4b0f2318-6095-4c1c-8f68-3e897f966ca2\") " Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.643474 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-logs" (OuterVolumeSpecName: "logs") pod "4b0f2318-6095-4c1c-8f68-3e897f966ca2" (UID: "4b0f2318-6095-4c1c-8f68-3e897f966ca2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.643648 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4b0f2318-6095-4c1c-8f68-3e897f966ca2" (UID: "4b0f2318-6095-4c1c-8f68-3e897f966ca2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.650475 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-scripts" (OuterVolumeSpecName: "scripts") pod "4b0f2318-6095-4c1c-8f68-3e897f966ca2" (UID: "4b0f2318-6095-4c1c-8f68-3e897f966ca2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.676385 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "4b0f2318-6095-4c1c-8f68-3e897f966ca2" (UID: "4b0f2318-6095-4c1c-8f68-3e897f966ca2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.677251 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b0f2318-6095-4c1c-8f68-3e897f966ca2-kube-api-access-qrnx8" (OuterVolumeSpecName: "kube-api-access-qrnx8") pod "4b0f2318-6095-4c1c-8f68-3e897f966ca2" (UID: "4b0f2318-6095-4c1c-8f68-3e897f966ca2"). InnerVolumeSpecName "kube-api-access-qrnx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.682552 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b0f2318-6095-4c1c-8f68-3e897f966ca2" (UID: "4b0f2318-6095-4c1c-8f68-3e897f966ca2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.702798 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4b0f2318-6095-4c1c-8f68-3e897f966ca2" (UID: "4b0f2318-6095-4c1c-8f68-3e897f966ca2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.713461 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-config-data" (OuterVolumeSpecName: "config-data") pod "4b0f2318-6095-4c1c-8f68-3e897f966ca2" (UID: "4b0f2318-6095-4c1c-8f68-3e897f966ca2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.745881 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.745910 4923 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.745925 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.745937 4923 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b0f2318-6095-4c1c-8f68-3e897f966ca2-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.745948 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.745981 4923 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.745997 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrnx8\" (UniqueName: \"kubernetes.io/projected/4b0f2318-6095-4c1c-8f68-3e897f966ca2-kube-api-access-qrnx8\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.746009 4923 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0f2318-6095-4c1c-8f68-3e897f966ca2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.773271 4923 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.847817 4923 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.920285 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4b0f2318-6095-4c1c-8f68-3e897f966ca2","Type":"ContainerDied","Data":"8a0c3be26b5c7bbe2013bc6b7ff044b5074acb7da3f65736104cbfbf5730ba96"} Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.920396 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.957619 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.964443 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.990031 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:13:20 crc kubenswrapper[4923]: E0224 03:13:20.990491 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerName="glance-log" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.990510 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerName="glance-log" Feb 24 03:13:20 crc kubenswrapper[4923]: E0224 03:13:20.990548 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerName="glance-httpd" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.990557 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerName="glance-httpd" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.990764 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerName="glance-httpd" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.990789 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" containerName="glance-log" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.992814 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.995210 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 24 03:13:20 crc kubenswrapper[4923]: I0224 03:13:20.995491 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.002654 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.153932 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-config-data\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.153983 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.154009 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.154068 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-logs\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.154083 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.154113 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mjds\" (UniqueName: \"kubernetes.io/projected/c8517902-92e0-4ee1-8765-9d7331ac90f4-kube-api-access-2mjds\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.154139 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.154171 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-scripts\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.255855 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.255959 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-scripts\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.256048 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-config-data\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.256099 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.256131 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.256255 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-logs\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.256281 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.256346 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mjds\" (UniqueName: \"kubernetes.io/projected/c8517902-92e0-4ee1-8765-9d7331ac90f4-kube-api-access-2mjds\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.256536 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.257464 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.258940 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-logs\") 
pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.261055 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-scripts\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.261102 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.262606 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-config-data\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.265166 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.274698 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mjds\" (UniqueName: \"kubernetes.io/projected/c8517902-92e0-4ee1-8765-9d7331ac90f4-kube-api-access-2mjds\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " 
pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.281181 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.312432 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:13:21 crc kubenswrapper[4923]: E0224 03:13:21.364430 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 24 03:13:21 crc kubenswrapper[4923]: E0224 03:13:21.364591 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f9skx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-rvlxb_openstack(29f64c48-5ed4-431c-8636-702a8abf02b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:13:21 crc kubenswrapper[4923]: E0224 03:13:21.365778 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-rvlxb" 
podUID="29f64c48-5ed4-431c-8636-702a8abf02b5" Feb 24 03:13:21 crc kubenswrapper[4923]: I0224 03:13:21.724729 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b0f2318-6095-4c1c-8f68-3e897f966ca2" path="/var/lib/kubelet/pods/4b0f2318-6095-4c1c-8f68-3e897f966ca2/volumes" Feb 24 03:13:21 crc kubenswrapper[4923]: E0224 03:13:21.931011 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-rvlxb" podUID="29f64c48-5ed4-431c-8636-702a8abf02b5" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.326138 4923 scope.go:117] "RemoveContainer" containerID="f56e856f839ebd36b3dc62c68002d7d85f952bfa98eff170165de03d9c8a342c" Feb 24 03:13:22 crc kubenswrapper[4923]: E0224 03:13:22.352940 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 24 03:13:22 crc kubenswrapper[4923]: E0224 03:13:22.353101 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-js5tl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-gdl2d_openstack(8ad7cefc-c3bb-48ff-ab05-0fe707823e84): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:13:22 crc kubenswrapper[4923]: E0224 03:13:22.354251 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-gdl2d" podUID="8ad7cefc-c3bb-48ff-ab05-0fe707823e84" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.464151 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-brpzr" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.481615 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.490802 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.576654 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnxdj\" (UniqueName: \"kubernetes.io/projected/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-kube-api-access-jnxdj\") pod \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.576766 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-combined-ca-bundle\") pod \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.576912 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-swift-storage-0\") pod \"9c060758-83bf-4a7c-90b4-13c2a20194b3\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577448 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-config-data\") pod \"a06236b1-dc62-4def-877d-65184413973e\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577505 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wk7v\" (UniqueName: \"kubernetes.io/projected/9c060758-83bf-4a7c-90b4-13c2a20194b3-kube-api-access-4wk7v\") pod \"9c060758-83bf-4a7c-90b4-13c2a20194b3\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577558 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-config\") pod \"9c060758-83bf-4a7c-90b4-13c2a20194b3\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577611 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-scripts\") pod \"a06236b1-dc62-4def-877d-65184413973e\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577645 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-fernet-keys\") pod \"a06236b1-dc62-4def-877d-65184413973e\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577702 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-nb\") pod \"9c060758-83bf-4a7c-90b4-13c2a20194b3\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577751 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-combined-ca-bundle\") pod \"a06236b1-dc62-4def-877d-65184413973e\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577780 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-sb\") pod \"9c060758-83bf-4a7c-90b4-13c2a20194b3\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " Feb 24 
03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577808 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-svc\") pod \"9c060758-83bf-4a7c-90b4-13c2a20194b3\" (UID: \"9c060758-83bf-4a7c-90b4-13c2a20194b3\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577848 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-config\") pod \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\" (UID: \"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577883 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-credential-keys\") pod \"a06236b1-dc62-4def-877d-65184413973e\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.577910 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6tcg\" (UniqueName: \"kubernetes.io/projected/a06236b1-dc62-4def-877d-65184413973e-kube-api-access-q6tcg\") pod \"a06236b1-dc62-4def-877d-65184413973e\" (UID: \"a06236b1-dc62-4def-877d-65184413973e\") " Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.581126 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-kube-api-access-jnxdj" (OuterVolumeSpecName: "kube-api-access-jnxdj") pod "4673b1fa-d73c-48c9-b2fd-d0f7afe97efd" (UID: "4673b1fa-d73c-48c9-b2fd-d0f7afe97efd"). InnerVolumeSpecName "kube-api-access-jnxdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.584706 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-scripts" (OuterVolumeSpecName: "scripts") pod "a06236b1-dc62-4def-877d-65184413973e" (UID: "a06236b1-dc62-4def-877d-65184413973e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.585226 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a06236b1-dc62-4def-877d-65184413973e" (UID: "a06236b1-dc62-4def-877d-65184413973e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.587886 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a06236b1-dc62-4def-877d-65184413973e" (UID: "a06236b1-dc62-4def-877d-65184413973e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.588005 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06236b1-dc62-4def-877d-65184413973e-kube-api-access-q6tcg" (OuterVolumeSpecName: "kube-api-access-q6tcg") pod "a06236b1-dc62-4def-877d-65184413973e" (UID: "a06236b1-dc62-4def-877d-65184413973e"). InnerVolumeSpecName "kube-api-access-q6tcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.604043 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c060758-83bf-4a7c-90b4-13c2a20194b3-kube-api-access-4wk7v" (OuterVolumeSpecName: "kube-api-access-4wk7v") pod "9c060758-83bf-4a7c-90b4-13c2a20194b3" (UID: "9c060758-83bf-4a7c-90b4-13c2a20194b3"). InnerVolumeSpecName "kube-api-access-4wk7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.612771 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4673b1fa-d73c-48c9-b2fd-d0f7afe97efd" (UID: "4673b1fa-d73c-48c9-b2fd-d0f7afe97efd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.617895 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-config" (OuterVolumeSpecName: "config") pod "4673b1fa-d73c-48c9-b2fd-d0f7afe97efd" (UID: "4673b1fa-d73c-48c9-b2fd-d0f7afe97efd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.625016 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-config-data" (OuterVolumeSpecName: "config-data") pod "a06236b1-dc62-4def-877d-65184413973e" (UID: "a06236b1-dc62-4def-877d-65184413973e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.629469 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a06236b1-dc62-4def-877d-65184413973e" (UID: "a06236b1-dc62-4def-877d-65184413973e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.641253 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c060758-83bf-4a7c-90b4-13c2a20194b3" (UID: "9c060758-83bf-4a7c-90b4-13c2a20194b3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.643996 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c060758-83bf-4a7c-90b4-13c2a20194b3" (UID: "9c060758-83bf-4a7c-90b4-13c2a20194b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.659538 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c060758-83bf-4a7c-90b4-13c2a20194b3" (UID: "9c060758-83bf-4a7c-90b4-13c2a20194b3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.669939 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-config" (OuterVolumeSpecName: "config") pod "9c060758-83bf-4a7c-90b4-13c2a20194b3" (UID: "9c060758-83bf-4a7c-90b4-13c2a20194b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.670424 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9c060758-83bf-4a7c-90b4-13c2a20194b3" (UID: "9c060758-83bf-4a7c-90b4-13c2a20194b3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679409 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679446 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679468 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679481 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc 
kubenswrapper[4923]: I0224 03:13:22.679492 4923 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679504 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6tcg\" (UniqueName: \"kubernetes.io/projected/a06236b1-dc62-4def-877d-65184413973e-kube-api-access-q6tcg\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679517 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnxdj\" (UniqueName: \"kubernetes.io/projected/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-kube-api-access-jnxdj\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679528 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679538 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679549 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679559 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wk7v\" (UniqueName: \"kubernetes.io/projected/9c060758-83bf-4a7c-90b4-13c2a20194b3-kube-api-access-4wk7v\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679569 4923 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679578 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679587 4923 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a06236b1-dc62-4def-877d-65184413973e-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.679597 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c060758-83bf-4a7c-90b4-13c2a20194b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.764823 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7fb8677dd-w8wrp"] Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.805394 4923 scope.go:117] "RemoveContainer" containerID="dabc68181fe71c00b9a6391131245ef1fc2d6e599fd51c465fb265fa9ee34c99" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.855107 4923 scope.go:117] "RemoveContainer" containerID="6a91a07ccc337d5c77ccf539943b73b40e13e07c932e9de47a7a76ba7c316a93" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.942774 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.943073 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-frjd7" event={"ID":"9c060758-83bf-4a7c-90b4-13c2a20194b3","Type":"ContainerDied","Data":"700c6f1f5762c48841fb04db345c7950ccb585a61dd1e37a5659ac0aa0b5334c"} Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.943116 4923 scope.go:117] "RemoveContainer" containerID="0f6191a8718c2bb6f45afbfa65d8a1e10252cc87287859ec22708178def8f5a9" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.947406 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-brpzr" event={"ID":"4673b1fa-d73c-48c9-b2fd-d0f7afe97efd","Type":"ContainerDied","Data":"c2fc2120b22c66907835119e8581c2231b1f52a807d40c234e77798389224409"} Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.947434 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2fc2120b22c66907835119e8581c2231b1f52a807d40c234e77798389224409" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.947485 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-brpzr" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.957269 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8677dd-w8wrp" event={"ID":"260b26fd-552c-4dbb-b181-d423dbd57de2","Type":"ContainerStarted","Data":"2bf5def33882cf573f78153ea23649dc9d2a81017a1bba44a1fe03a91b9eba3d"} Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.959329 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9thl9" event={"ID":"a06236b1-dc62-4def-877d-65184413973e","Type":"ContainerDied","Data":"a8db3ab4bafe04a70349384b887a95f78477a3158d01b6a459147eb8050d2314"} Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.959374 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8db3ab4bafe04a70349384b887a95f78477a3158d01b6a459147eb8050d2314" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.959448 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9thl9" Feb 24 03:13:22 crc kubenswrapper[4923]: E0224 03:13:22.973522 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-gdl2d" podUID="8ad7cefc-c3bb-48ff-ab05-0fe707823e84" Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.985603 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-frjd7"] Feb 24 03:13:22 crc kubenswrapper[4923]: I0224 03:13:22.996534 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-frjd7"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.017029 4923 scope.go:117] "RemoveContainer" containerID="dc78fc3ab389062e12dbb82c0becb861e0e72f87f4c96426437740eb849e38ee" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.178762 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dcbd8cd94-497ns"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.422942 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.601696 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9thl9"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.638642 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9thl9"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.679626 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bqtmp"] Feb 24 03:13:23 crc kubenswrapper[4923]: E0224 03:13:23.680006 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerName="init" Feb 24 03:13:23 crc kubenswrapper[4923]: 
I0224 03:13:23.680023 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerName="init" Feb 24 03:13:23 crc kubenswrapper[4923]: E0224 03:13:23.680033 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06236b1-dc62-4def-877d-65184413973e" containerName="keystone-bootstrap" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.680041 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06236b1-dc62-4def-877d-65184413973e" containerName="keystone-bootstrap" Feb 24 03:13:23 crc kubenswrapper[4923]: E0224 03:13:23.680054 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerName="dnsmasq-dns" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.680061 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerName="dnsmasq-dns" Feb 24 03:13:23 crc kubenswrapper[4923]: E0224 03:13:23.680092 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4673b1fa-d73c-48c9-b2fd-d0f7afe97efd" containerName="neutron-db-sync" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.680101 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4673b1fa-d73c-48c9-b2fd-d0f7afe97efd" containerName="neutron-db-sync" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.680256 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06236b1-dc62-4def-877d-65184413973e" containerName="keystone-bootstrap" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.680275 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4673b1fa-d73c-48c9-b2fd-d0f7afe97efd" containerName="neutron-db-sync" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.680287 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" containerName="dnsmasq-dns" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.680878 
4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.693546 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-26zv5" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.693734 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.693891 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.693996 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.694496 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.742271 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c060758-83bf-4a7c-90b4-13c2a20194b3" path="/var/lib/kubelet/pods/9c060758-83bf-4a7c-90b4-13c2a20194b3/volumes" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.742852 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a06236b1-dc62-4def-877d-65184413973e" path="/var/lib/kubelet/pods/a06236b1-dc62-4def-877d-65184413973e/volumes" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.744940 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zxvbv"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.746698 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.753537 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bqtmp"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.766502 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zxvbv"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.779285 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57f869f9f6-f2wpq"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.780859 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.786579 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.786886 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.787423 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.787532 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4m94q" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.796474 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57f869f9f6-f2wpq"] Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.801085 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-combined-ca-bundle\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc 
kubenswrapper[4923]: I0224 03:13:23.803074 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-scripts\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.803169 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-config-data\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.803339 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sc5v\" (UniqueName: \"kubernetes.io/projected/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-kube-api-access-9sc5v\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.803369 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-fernet-keys\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.803756 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-credential-keys\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: 
I0224 03:13:23.906151 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-config-data\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906193 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-combined-ca-bundle\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906221 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-config\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906252 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jjws\" (UniqueName: \"kubernetes.io/projected/061ab736-e68d-4053-b8d3-13ab8220ef22-kube-api-access-2jjws\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906283 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sc5v\" (UniqueName: \"kubernetes.io/projected/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-kube-api-access-9sc5v\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906317 
4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906333 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-fernet-keys\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906369 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-httpd-config\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906387 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-ovndb-tls-certs\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906416 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-credential-keys\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906436 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-svc\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906455 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906470 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ff4c\" (UniqueName: \"kubernetes.io/projected/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-kube-api-access-7ff4c\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906485 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-config\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906532 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-combined-ca-bundle\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906552 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-scripts\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.906584 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.911749 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-scripts\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.914776 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-credential-keys\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.920891 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-fernet-keys\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.923806 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-combined-ca-bundle\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.925120 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-config-data\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.925816 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sc5v\" (UniqueName: \"kubernetes.io/projected/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-kube-api-access-9sc5v\") pod \"keystone-bootstrap-bqtmp\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.981793 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc6fbcd4c-xk5sx" event={"ID":"bea6c18a-0093-4b3b-b56b-323a86181da5","Type":"ContainerStarted","Data":"404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9"} Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.981838 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc6fbcd4c-xk5sx" event={"ID":"bea6c18a-0093-4b3b-b56b-323a86181da5","Type":"ContainerStarted","Data":"5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a"} Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.981887 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bc6fbcd4c-xk5sx" podUID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerName="horizon-log" containerID="cri-o://5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a" gracePeriod=30 Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.982013 4923 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bc6fbcd4c-xk5sx" podUID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerName="horizon" containerID="cri-o://404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9" gracePeriod=30 Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.990108 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8677dd-w8wrp" event={"ID":"260b26fd-552c-4dbb-b181-d423dbd57de2","Type":"ContainerStarted","Data":"eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112"} Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.990158 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8677dd-w8wrp" event={"ID":"260b26fd-552c-4dbb-b181-d423dbd57de2","Type":"ContainerStarted","Data":"56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f"} Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.991834 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8517902-92e0-4ee1-8765-9d7331ac90f4","Type":"ContainerStarted","Data":"9eca302f370a273f2dafdee277198a33467f2277ef7e66be7ff4825d38e07c2b"} Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.995516 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dcbd8cd94-497ns" event={"ID":"3cad919b-bb41-4c17-a13a-01831e715fd9","Type":"ContainerStarted","Data":"5a4cd0b5439f22bdbff2f50ae8d5a31e9170896b7d0a0c5cb61cb126b9556f5d"} Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.995562 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dcbd8cd94-497ns" event={"ID":"3cad919b-bb41-4c17-a13a-01831e715fd9","Type":"ContainerStarted","Data":"b48786ccc2d99f4294816dc490ef075888daf9ab084a347c0f9d0530153b5b2d"} Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.995571 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dcbd8cd94-497ns" 
event={"ID":"3cad919b-bb41-4c17-a13a-01831e715fd9","Type":"ContainerStarted","Data":"9228d610ca9755191d04976161d9f2bcc30792a4bef2294ecb25a54d60c465ea"} Feb 24 03:13:23 crc kubenswrapper[4923]: I0224 03:13:23.998402 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ds4gb" event={"ID":"d0d1f021-7b1a-491b-9dd5-90d6425bcde7","Type":"ContainerStarted","Data":"03e8bda9ede338d2a1a471ee37b9b128d3ab6960e598c9ffbd75a7b0af61f2fd"} Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.009713 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-config\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.009843 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.009875 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-combined-ca-bundle\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.009908 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-config\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc 
kubenswrapper[4923]: I0224 03:13:24.009945 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jjws\" (UniqueName: \"kubernetes.io/projected/061ab736-e68d-4053-b8d3-13ab8220ef22-kube-api-access-2jjws\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.009988 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.010037 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-httpd-config\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.010063 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-ovndb-tls-certs\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.010107 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-svc\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.010132 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7ff4c\" (UniqueName: \"kubernetes.io/projected/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-kube-api-access-7ff4c\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.010159 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.011491 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.012143 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.018585 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-config\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.019140 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-config\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.019776 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-httpd-config\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.019922 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.022890 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-svc\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.023001 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75cfdb654f-g2wdh" event={"ID":"059fc35a-7b75-46b7-86f3-7b05fb19c5de","Type":"ContainerStarted","Data":"77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b"} Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.023038 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75cfdb654f-g2wdh" event={"ID":"059fc35a-7b75-46b7-86f3-7b05fb19c5de","Type":"ContainerStarted","Data":"e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9"} Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 
03:13:24.023146 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75cfdb654f-g2wdh" podUID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerName="horizon-log" containerID="cri-o://e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9" gracePeriod=30 Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.023229 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75cfdb654f-g2wdh" podUID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerName="horizon" containerID="cri-o://77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b" gracePeriod=30 Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.028120 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-combined-ca-bundle\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.032948 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jjws\" (UniqueName: \"kubernetes.io/projected/061ab736-e68d-4053-b8d3-13ab8220ef22-kube-api-access-2jjws\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.033282 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1710156b-5155-4340-8013-2f9e3d68be35","Type":"ContainerStarted","Data":"29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49"} Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.035146 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-bc6fbcd4c-xk5sx" podStartSLOduration=3.746756733 podStartE2EDuration="30.035120333s" podCreationTimestamp="2026-02-24 
03:12:54 +0000 UTC" firstStartedPulling="2026-02-24 03:12:56.576910125 +0000 UTC m=+1100.593980938" lastFinishedPulling="2026-02-24 03:13:22.865273715 +0000 UTC m=+1126.882344538" observedRunningTime="2026-02-24 03:13:24.002188925 +0000 UTC m=+1128.019259738" watchObservedRunningTime="2026-02-24 03:13:24.035120333 +0000 UTC m=+1128.052191146" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.036200 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b899b65-wzw2v" event={"ID":"f3e8092d-b9af-4e2f-a5f1-0682e2eff867","Type":"ContainerStarted","Data":"bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb"} Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.036240 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b899b65-wzw2v" event={"ID":"f3e8092d-b9af-4e2f-a5f1-0682e2eff867","Type":"ContainerStarted","Data":"55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a"} Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.036379 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-575b899b65-wzw2v" podUID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerName="horizon-log" containerID="cri-o://55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a" gracePeriod=30 Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.036497 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-575b899b65-wzw2v" podUID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerName="horizon" containerID="cri-o://bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb" gracePeriod=30 Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.036722 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7fb8677dd-w8wrp" podStartSLOduration=21.036704645 podStartE2EDuration="21.036704645s" podCreationTimestamp="2026-02-24 03:13:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:24.036047808 +0000 UTC m=+1128.053118631" watchObservedRunningTime="2026-02-24 03:13:24.036704645 +0000 UTC m=+1128.053775458" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.042029 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-ovndb-tls-certs\") pod \"neutron-57f869f9f6-f2wpq\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.045780 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ff4c\" (UniqueName: \"kubernetes.io/projected/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-kube-api-access-7ff4c\") pod \"dnsmasq-dns-6b7b667979-zxvbv\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.066946 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ds4gb" podStartSLOduration=4.368978027 podStartE2EDuration="30.066926754s" podCreationTimestamp="2026-02-24 03:12:54 +0000 UTC" firstStartedPulling="2026-02-24 03:12:56.633445531 +0000 UTC m=+1100.650516344" lastFinishedPulling="2026-02-24 03:13:22.331394258 +0000 UTC m=+1126.348465071" observedRunningTime="2026-02-24 03:13:24.056471011 +0000 UTC m=+1128.073541824" watchObservedRunningTime="2026-02-24 03:13:24.066926754 +0000 UTC m=+1128.083997567" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.070096 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.078780 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6dcbd8cd94-497ns" podStartSLOduration=21.078759303 podStartE2EDuration="21.078759303s" podCreationTimestamp="2026-02-24 03:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:24.073709981 +0000 UTC m=+1128.090780814" watchObservedRunningTime="2026-02-24 03:13:24.078759303 +0000 UTC m=+1128.095830116" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.097880 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-575b899b65-wzw2v" podStartSLOduration=2.931465335 podStartE2EDuration="27.097865721s" podCreationTimestamp="2026-02-24 03:12:57 +0000 UTC" firstStartedPulling="2026-02-24 03:12:58.147250629 +0000 UTC m=+1102.164321442" lastFinishedPulling="2026-02-24 03:13:22.313651015 +0000 UTC m=+1126.330721828" observedRunningTime="2026-02-24 03:13:24.092308156 +0000 UTC m=+1128.109378969" watchObservedRunningTime="2026-02-24 03:13:24.097865721 +0000 UTC m=+1128.114936534" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.102783 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:24 crc kubenswrapper[4923]: I0224 03:13:24.129111 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:25 crc kubenswrapper[4923]: I0224 03:13:25.020092 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-75cfdb654f-g2wdh" podStartSLOduration=3.839024877 podStartE2EDuration="30.020071866s" podCreationTimestamp="2026-02-24 03:12:55 +0000 UTC" firstStartedPulling="2026-02-24 03:12:56.635909235 +0000 UTC m=+1100.652980048" lastFinishedPulling="2026-02-24 03:13:22.816956224 +0000 UTC m=+1126.834027037" observedRunningTime="2026-02-24 03:13:24.122578136 +0000 UTC m=+1128.139648949" watchObservedRunningTime="2026-02-24 03:13:25.020071866 +0000 UTC m=+1129.037142679" Feb 24 03:13:25 crc kubenswrapper[4923]: I0224 03:13:25.021071 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:13:25 crc kubenswrapper[4923]: I0224 03:13:25.204512 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:13:25 crc kubenswrapper[4923]: I0224 03:13:25.609685 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bqtmp"] Feb 24 03:13:25 crc kubenswrapper[4923]: I0224 03:13:25.628138 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:13:25 crc kubenswrapper[4923]: W0224 03:13:25.642879 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf4b1a04_0555_4f02_b5bc_a8f90141e8dc.slice/crio-ed336c8d60831500be3bd007aff408657fb2e46f41805c71847eb299a1d56890 WatchSource:0}: Error finding container ed336c8d60831500be3bd007aff408657fb2e46f41805c71847eb299a1d56890: Status 404 returned error can't find the container with id ed336c8d60831500be3bd007aff408657fb2e46f41805c71847eb299a1d56890 Feb 24 03:13:25 crc kubenswrapper[4923]: I0224 03:13:25.691368 4923 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zxvbv"] Feb 24 03:13:25 crc kubenswrapper[4923]: I0224 03:13:25.799439 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57f869f9f6-f2wpq"] Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.047027 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-86d47cfd49-7hzln"] Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.048777 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.053031 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.053534 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86d47cfd49-7hzln"] Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.054358 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.096884 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" event={"ID":"d2108dcb-bf35-433c-8fc3-49a4e63da0fe","Type":"ContainerStarted","Data":"6b2e48f2b99f01a82bf1a45f4f756808c7a57e105cceab31161fc1282cc82292"} Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.099684 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f869f9f6-f2wpq" event={"ID":"061ab736-e68d-4053-b8d3-13ab8220ef22","Type":"ContainerStarted","Data":"1e9250594c86112449a3035e9dc00e0dc416ccfc767d46e1a4a5837214a15e0f"} Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.102182 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"cb08edd9-6041-489a-8713-8bc00d88527c","Type":"ContainerStarted","Data":"8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11"} Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.102237 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb08edd9-6041-489a-8713-8bc00d88527c","Type":"ContainerStarted","Data":"f53b42741e1f2e4975286faad6425513bea80193822a908fece55138d420cd77"} Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.111591 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8517902-92e0-4ee1-8765-9d7331ac90f4","Type":"ContainerStarted","Data":"e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700"} Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.113721 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqtmp" event={"ID":"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc","Type":"ContainerStarted","Data":"7af080accce9475d950c8fde299faefe8047b53f7cb4ff3d2122f0717760e5cf"} Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.113784 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqtmp" event={"ID":"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc","Type":"ContainerStarted","Data":"ed336c8d60831500be3bd007aff408657fb2e46f41805c71847eb299a1d56890"} Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.148004 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bqtmp" podStartSLOduration=3.14798829 podStartE2EDuration="3.14798829s" podCreationTimestamp="2026-02-24 03:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:26.142740803 +0000 UTC m=+1130.159811616" watchObservedRunningTime="2026-02-24 03:13:26.14798829 +0000 UTC m=+1130.165059103" Feb 24 03:13:26 crc 
kubenswrapper[4923]: I0224 03:13:26.176332 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-internal-tls-certs\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.176415 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-httpd-config\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.176436 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbv92\" (UniqueName: \"kubernetes.io/projected/c2b74cd8-a2f1-4db6-b604-1add73452a54-kube-api-access-pbv92\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.176523 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-combined-ca-bundle\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.176556 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-public-tls-certs\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" 
Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.176576 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-ovndb-tls-certs\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.176605 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-config\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.279170 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbv92\" (UniqueName: \"kubernetes.io/projected/c2b74cd8-a2f1-4db6-b604-1add73452a54-kube-api-access-pbv92\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.279243 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-combined-ca-bundle\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.279266 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-public-tls-certs\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 
03:13:26.279285 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-ovndb-tls-certs\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.279354 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-config\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.279418 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-internal-tls-certs\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.279461 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-httpd-config\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.286736 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-internal-tls-certs\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.288089 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-config\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.296120 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-combined-ca-bundle\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.296596 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-httpd-config\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.297457 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-ovndb-tls-certs\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.300099 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-public-tls-certs\") pod \"neutron-86d47cfd49-7hzln\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.300769 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbv92\" (UniqueName: \"kubernetes.io/projected/c2b74cd8-a2f1-4db6-b604-1add73452a54-kube-api-access-pbv92\") pod \"neutron-86d47cfd49-7hzln\" (UID: 
\"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:26 crc kubenswrapper[4923]: I0224 03:13:26.544691 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.129893 4923 generic.go:334] "Generic (PLEG): container finished" podID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" containerID="9be5e10d3fceb9fd20b5323aec018efe9769722e4f8fa9b42d6883c406138f66" exitCode=0 Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.130166 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" event={"ID":"d2108dcb-bf35-433c-8fc3-49a4e63da0fe","Type":"ContainerDied","Data":"9be5e10d3fceb9fd20b5323aec018efe9769722e4f8fa9b42d6883c406138f66"} Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.139703 4923 generic.go:334] "Generic (PLEG): container finished" podID="d0d1f021-7b1a-491b-9dd5-90d6425bcde7" containerID="03e8bda9ede338d2a1a471ee37b9b128d3ab6960e598c9ffbd75a7b0af61f2fd" exitCode=0 Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.139785 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ds4gb" event={"ID":"d0d1f021-7b1a-491b-9dd5-90d6425bcde7","Type":"ContainerDied","Data":"03e8bda9ede338d2a1a471ee37b9b128d3ab6960e598c9ffbd75a7b0af61f2fd"} Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.164118 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f869f9f6-f2wpq" event={"ID":"061ab736-e68d-4053-b8d3-13ab8220ef22","Type":"ContainerStarted","Data":"be8f1d820b924b4e30acc346afdbdc68223f3cc44f3913c62024f7379f36d1de"} Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.164416 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f869f9f6-f2wpq" 
event={"ID":"061ab736-e68d-4053-b8d3-13ab8220ef22","Type":"ContainerStarted","Data":"b34d0a1c313990b17099067d7d4399a6487d36644a99fc842feb91da7815714f"} Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.164433 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.179093 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8517902-92e0-4ee1-8765-9d7331ac90f4","Type":"ContainerStarted","Data":"6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4"} Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.205266 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57f869f9f6-f2wpq" podStartSLOduration=4.205250671 podStartE2EDuration="4.205250671s" podCreationTimestamp="2026-02-24 03:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:27.193884454 +0000 UTC m=+1131.210955267" watchObservedRunningTime="2026-02-24 03:13:27.205250671 +0000 UTC m=+1131.222321484" Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.262200 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.262179197 podStartE2EDuration="7.262179197s" podCreationTimestamp="2026-02-24 03:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:27.22283248 +0000 UTC m=+1131.239903293" watchObservedRunningTime="2026-02-24 03:13:27.262179197 +0000 UTC m=+1131.279250010" Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.287838 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86d47cfd49-7hzln"] Feb 24 03:13:27 crc kubenswrapper[4923]: I0224 03:13:27.417076 
4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.195783 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" event={"ID":"d2108dcb-bf35-433c-8fc3-49a4e63da0fe","Type":"ContainerStarted","Data":"ed57864f95ca8fb86656872b4e3ade157b262bc5e879a44e51bf9610c281373c"} Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.196355 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.201060 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb08edd9-6041-489a-8713-8bc00d88527c","Type":"ContainerStarted","Data":"f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4"} Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.215339 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d47cfd49-7hzln" event={"ID":"c2b74cd8-a2f1-4db6-b604-1add73452a54","Type":"ContainerStarted","Data":"cb7411d44f45defbd1929cfc1eae6d03e59fdb9ab8d24efc07231c5658ea4b54"} Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.215398 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d47cfd49-7hzln" event={"ID":"c2b74cd8-a2f1-4db6-b604-1add73452a54","Type":"ContainerStarted","Data":"2577b1188f7c0e3f34a30b1a81fe21830f3df1398416983be539a4ae6020abfb"} Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.234178 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" podStartSLOduration=5.23416263 podStartE2EDuration="5.23416263s" podCreationTimestamp="2026-02-24 03:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:28.21960475 +0000 
UTC m=+1132.236675563" watchObservedRunningTime="2026-02-24 03:13:28.23416263 +0000 UTC m=+1132.251233443" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.248706 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=17.248683929 podStartE2EDuration="17.248683929s" podCreationTimestamp="2026-02-24 03:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:28.244821248 +0000 UTC m=+1132.261892061" watchObservedRunningTime="2026-02-24 03:13:28.248683929 +0000 UTC m=+1132.265754742" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.616263 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ds4gb" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.741541 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-combined-ca-bundle\") pod \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.741631 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-config-data\") pod \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.741663 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-scripts\") pod \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.741689 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-s6zbc\" (UniqueName: \"kubernetes.io/projected/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-kube-api-access-s6zbc\") pod \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.741805 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-logs\") pod \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\" (UID: \"d0d1f021-7b1a-491b-9dd5-90d6425bcde7\") " Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.742356 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-logs" (OuterVolumeSpecName: "logs") pod "d0d1f021-7b1a-491b-9dd5-90d6425bcde7" (UID: "d0d1f021-7b1a-491b-9dd5-90d6425bcde7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.748536 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-scripts" (OuterVolumeSpecName: "scripts") pod "d0d1f021-7b1a-491b-9dd5-90d6425bcde7" (UID: "d0d1f021-7b1a-491b-9dd5-90d6425bcde7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.762474 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-kube-api-access-s6zbc" (OuterVolumeSpecName: "kube-api-access-s6zbc") pod "d0d1f021-7b1a-491b-9dd5-90d6425bcde7" (UID: "d0d1f021-7b1a-491b-9dd5-90d6425bcde7"). InnerVolumeSpecName "kube-api-access-s6zbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.766863 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-config-data" (OuterVolumeSpecName: "config-data") pod "d0d1f021-7b1a-491b-9dd5-90d6425bcde7" (UID: "d0d1f021-7b1a-491b-9dd5-90d6425bcde7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.772376 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0d1f021-7b1a-491b-9dd5-90d6425bcde7" (UID: "d0d1f021-7b1a-491b-9dd5-90d6425bcde7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.843853 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.843885 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.843899 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:28 crc kubenswrapper[4923]: I0224 03:13:28.843908 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:28 crc 
kubenswrapper[4923]: I0224 03:13:28.843916 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6zbc\" (UniqueName: \"kubernetes.io/projected/d0d1f021-7b1a-491b-9dd5-90d6425bcde7-kube-api-access-s6zbc\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.223791 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d47cfd49-7hzln" event={"ID":"c2b74cd8-a2f1-4db6-b604-1add73452a54","Type":"ContainerStarted","Data":"53b8b575df0335247e17f77ae72a9c76ce47979f5ebab2910de76cb5d115a980"} Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.223954 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.225649 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ds4gb" event={"ID":"d0d1f021-7b1a-491b-9dd5-90d6425bcde7","Type":"ContainerDied","Data":"f5b028dd0787b75b9d00df4f6c5fccbbab71674a30d7a99ad29ad0b4dea850bf"} Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.225713 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b028dd0787b75b9d00df4f6c5fccbbab71674a30d7a99ad29ad0b4dea850bf" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.225908 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ds4gb" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.258050 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-86d47cfd49-7hzln" podStartSLOduration=3.258030948 podStartE2EDuration="3.258030948s" podCreationTimestamp="2026-02-24 03:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:29.25081268 +0000 UTC m=+1133.267883493" watchObservedRunningTime="2026-02-24 03:13:29.258030948 +0000 UTC m=+1133.275101761" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.272460 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-557784489b-tj6xd"] Feb 24 03:13:29 crc kubenswrapper[4923]: E0224 03:13:29.272995 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d1f021-7b1a-491b-9dd5-90d6425bcde7" containerName="placement-db-sync" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.273011 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d1f021-7b1a-491b-9dd5-90d6425bcde7" containerName="placement-db-sync" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.273215 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d1f021-7b1a-491b-9dd5-90d6425bcde7" containerName="placement-db-sync" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.274085 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.283664 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.283859 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.284142 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-x4cdv" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.285365 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.285500 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.291522 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-557784489b-tj6xd"] Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.366722 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-scripts\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.366767 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-public-tls-certs\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.366996 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxnxq\" (UniqueName: \"kubernetes.io/projected/fe80dcd0-03dd-4361-a639-a995c6f55a08-kube-api-access-lxnxq\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.367062 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-combined-ca-bundle\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.367244 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-config-data\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.367420 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-internal-tls-certs\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.367448 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe80dcd0-03dd-4361-a639-a995c6f55a08-logs\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.469115 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxnxq\" (UniqueName: \"kubernetes.io/projected/fe80dcd0-03dd-4361-a639-a995c6f55a08-kube-api-access-lxnxq\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.469202 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-combined-ca-bundle\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.469879 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-config-data\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.469971 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-internal-tls-certs\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.470014 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe80dcd0-03dd-4361-a639-a995c6f55a08-logs\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.470141 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-scripts\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.470173 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-public-tls-certs\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.470516 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe80dcd0-03dd-4361-a639-a995c6f55a08-logs\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.472873 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-combined-ca-bundle\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.473424 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-internal-tls-certs\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.477566 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-scripts\") pod \"placement-557784489b-tj6xd\" 
(UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.479750 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-public-tls-certs\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.488907 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-config-data\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.494772 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxnxq\" (UniqueName: \"kubernetes.io/projected/fe80dcd0-03dd-4361-a639-a995c6f55a08-kube-api-access-lxnxq\") pod \"placement-557784489b-tj6xd\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:29 crc kubenswrapper[4923]: I0224 03:13:29.587980 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:30 crc kubenswrapper[4923]: I0224 03:13:30.121614 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-557784489b-tj6xd"] Feb 24 03:13:31 crc kubenswrapper[4923]: I0224 03:13:31.243148 4923 generic.go:334] "Generic (PLEG): container finished" podID="cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" containerID="7af080accce9475d950c8fde299faefe8047b53f7cb4ff3d2122f0717760e5cf" exitCode=0 Feb 24 03:13:31 crc kubenswrapper[4923]: I0224 03:13:31.243196 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqtmp" event={"ID":"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc","Type":"ContainerDied","Data":"7af080accce9475d950c8fde299faefe8047b53f7cb4ff3d2122f0717760e5cf"} Feb 24 03:13:31 crc kubenswrapper[4923]: I0224 03:13:31.313574 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 24 03:13:31 crc kubenswrapper[4923]: I0224 03:13:31.313674 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 24 03:13:31 crc kubenswrapper[4923]: I0224 03:13:31.340166 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 24 03:13:31 crc kubenswrapper[4923]: I0224 03:13:31.352432 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 24 03:13:32 crc kubenswrapper[4923]: I0224 03:13:32.208130 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:32 crc kubenswrapper[4923]: I0224 03:13:32.208473 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:32 crc kubenswrapper[4923]: I0224 03:13:32.252032 4923 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 24 03:13:32 crc kubenswrapper[4923]: I0224 03:13:32.252158 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 24 03:13:32 crc kubenswrapper[4923]: I0224 03:13:32.290982 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:32 crc kubenswrapper[4923]: I0224 03:13:32.298878 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:33 crc kubenswrapper[4923]: I0224 03:13:33.261547 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:33 crc kubenswrapper[4923]: I0224 03:13:33.261820 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:33 crc kubenswrapper[4923]: I0224 03:13:33.767705 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:33 crc kubenswrapper[4923]: I0224 03:13:33.768612 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:33 crc kubenswrapper[4923]: I0224 03:13:33.769657 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fb8677dd-w8wrp" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 24 03:13:33 crc kubenswrapper[4923]: I0224 03:13:33.834879 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:33 crc kubenswrapper[4923]: I0224 03:13:33.834926 4923 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:33 crc kubenswrapper[4923]: I0224 03:13:33.836562 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6dcbd8cd94-497ns" podUID="3cad919b-bb41-4c17-a13a-01831e715fd9" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Feb 24 03:13:34 crc kubenswrapper[4923]: I0224 03:13:34.106129 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:34 crc kubenswrapper[4923]: I0224 03:13:34.212998 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s64ld"] Feb 24 03:13:34 crc kubenswrapper[4923]: I0224 03:13:34.213233 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" podUID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" containerName="dnsmasq-dns" containerID="cri-o://22d281b97ce0278ea23d502288dd12ed402dd1839acaed7bfa6bbfc2874f3546" gracePeriod=10 Feb 24 03:13:34 crc kubenswrapper[4923]: I0224 03:13:34.949364 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 24 03:13:34 crc kubenswrapper[4923]: I0224 03:13:34.949699 4923 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 24 03:13:34 crc kubenswrapper[4923]: I0224 03:13:34.952900 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 24 03:13:35 crc kubenswrapper[4923]: I0224 03:13:35.287600 4923 generic.go:334] "Generic (PLEG): container finished" podID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" containerID="22d281b97ce0278ea23d502288dd12ed402dd1839acaed7bfa6bbfc2874f3546" exitCode=0 Feb 24 03:13:35 crc kubenswrapper[4923]: I0224 03:13:35.287701 
4923 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 24 03:13:35 crc kubenswrapper[4923]: I0224 03:13:35.287711 4923 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 24 03:13:35 crc kubenswrapper[4923]: I0224 03:13:35.288734 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" event={"ID":"33493b8c-7d7c-4ad5-9b81-12da1ae17aee","Type":"ContainerDied","Data":"22d281b97ce0278ea23d502288dd12ed402dd1839acaed7bfa6bbfc2874f3546"} Feb 24 03:13:35 crc kubenswrapper[4923]: I0224 03:13:35.779472 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" podUID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: connect: connection refused" Feb 24 03:13:35 crc kubenswrapper[4923]: I0224 03:13:35.833462 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:36 crc kubenswrapper[4923]: W0224 03:13:36.018269 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe80dcd0_03dd_4361_a639_a995c6f55a08.slice/crio-86cbe5e4ae62d6c0073711e0a19cf065a8364bed46ebb210fce3f08f29a7a604 WatchSource:0}: Error finding container 86cbe5e4ae62d6c0073711e0a19cf065a8364bed46ebb210fce3f08f29a7a604: Status 404 returned error can't find the container with id 86cbe5e4ae62d6c0073711e0a19cf065a8364bed46ebb210fce3f08f29a7a604 Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.081127 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.313905 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.316037 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-557784489b-tj6xd" event={"ID":"fe80dcd0-03dd-4361-a639-a995c6f55a08","Type":"ContainerStarted","Data":"86cbe5e4ae62d6c0073711e0a19cf065a8364bed46ebb210fce3f08f29a7a604"} Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.318739 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqtmp" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.318986 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqtmp" event={"ID":"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc","Type":"ContainerDied","Data":"ed336c8d60831500be3bd007aff408657fb2e46f41805c71847eb299a1d56890"} Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.319068 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed336c8d60831500be3bd007aff408657fb2e46f41805c71847eb299a1d56890" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.425854 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-scripts\") pod \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.425904 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sc5v\" (UniqueName: \"kubernetes.io/projected/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-kube-api-access-9sc5v\") pod \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.425925 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-fernet-keys\") pod \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.425965 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-credential-keys\") pod \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.425986 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-combined-ca-bundle\") pod \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.426156 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-config-data\") pod \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\" (UID: \"cf4b1a04-0555-4f02-b5bc-a8f90141e8dc\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.436891 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-scripts" (OuterVolumeSpecName: "scripts") pod "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" (UID: "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.437546 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" (UID: "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.461493 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-kube-api-access-9sc5v" (OuterVolumeSpecName: "kube-api-access-9sc5v") pod "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" (UID: "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc"). InnerVolumeSpecName "kube-api-access-9sc5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.462442 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" (UID: "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.474441 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" (UID: "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.507546 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-config-data" (OuterVolumeSpecName: "config-data") pod "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" (UID: "cf4b1a04-0555-4f02-b5bc-a8f90141e8dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.515601 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.527772 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.527798 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.527807 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sc5v\" (UniqueName: \"kubernetes.io/projected/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-kube-api-access-9sc5v\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.527819 4923 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.527827 4923 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.527836 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.629247 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-sb\") pod \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\" (UID: 
\"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.629290 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2q8s\" (UniqueName: \"kubernetes.io/projected/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-kube-api-access-f2q8s\") pod \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.629442 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-swift-storage-0\") pod \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.629952 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-config\") pod \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.630013 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-svc\") pod \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.630325 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-nb\") pod \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\" (UID: \"33493b8c-7d7c-4ad5-9b81-12da1ae17aee\") " Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.638620 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-kube-api-access-f2q8s" (OuterVolumeSpecName: "kube-api-access-f2q8s") pod "33493b8c-7d7c-4ad5-9b81-12da1ae17aee" (UID: "33493b8c-7d7c-4ad5-9b81-12da1ae17aee"). InnerVolumeSpecName "kube-api-access-f2q8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.691879 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-config" (OuterVolumeSpecName: "config") pod "33493b8c-7d7c-4ad5-9b81-12da1ae17aee" (UID: "33493b8c-7d7c-4ad5-9b81-12da1ae17aee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.692763 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33493b8c-7d7c-4ad5-9b81-12da1ae17aee" (UID: "33493b8c-7d7c-4ad5-9b81-12da1ae17aee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.694740 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33493b8c-7d7c-4ad5-9b81-12da1ae17aee" (UID: "33493b8c-7d7c-4ad5-9b81-12da1ae17aee"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.701248 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33493b8c-7d7c-4ad5-9b81-12da1ae17aee" (UID: "33493b8c-7d7c-4ad5-9b81-12da1ae17aee"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.703077 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33493b8c-7d7c-4ad5-9b81-12da1ae17aee" (UID: "33493b8c-7d7c-4ad5-9b81-12da1ae17aee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.732248 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.732307 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.732320 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.732332 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.732344 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2q8s\" (UniqueName: \"kubernetes.io/projected/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-kube-api-access-f2q8s\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:36 crc kubenswrapper[4923]: I0224 03:13:36.732355 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/33493b8c-7d7c-4ad5-9b81-12da1ae17aee-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.326498 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rvlxb" event={"ID":"29f64c48-5ed4-431c-8636-702a8abf02b5","Type":"ContainerStarted","Data":"37eb58f3e17d647a959a2ac1115b7933598482b86904200c5c5c2f9e0e7dbb47"} Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.329590 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" event={"ID":"33493b8c-7d7c-4ad5-9b81-12da1ae17aee","Type":"ContainerDied","Data":"f426205667284284d04d2bc425198a06bdefb22ed3c38920065e340bea360cda"} Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.329627 4923 scope.go:117] "RemoveContainer" containerID="22d281b97ce0278ea23d502288dd12ed402dd1839acaed7bfa6bbfc2874f3546" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.329711 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-s64ld" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.334666 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1710156b-5155-4340-8013-2f9e3d68be35","Type":"ContainerStarted","Data":"e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974"} Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.337130 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-557784489b-tj6xd" event={"ID":"fe80dcd0-03dd-4361-a639-a995c6f55a08","Type":"ContainerStarted","Data":"656653f429ff4f783e8d509dcb9631e292a9daea43b820b356ac16a1b4157b00"} Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.337156 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-557784489b-tj6xd" event={"ID":"fe80dcd0-03dd-4361-a639-a995c6f55a08","Type":"ContainerStarted","Data":"f47b21d2eb78e0fa5b014dd1fdfcae3eb1646e6a6d51602bfe879c63435b7a7b"} Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.337272 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.350225 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rvlxb" podStartSLOduration=2.821725111 podStartE2EDuration="42.350203935s" podCreationTimestamp="2026-02-24 03:12:55 +0000 UTC" firstStartedPulling="2026-02-24 03:12:56.682014479 +0000 UTC m=+1100.699085292" lastFinishedPulling="2026-02-24 03:13:36.210493303 +0000 UTC m=+1140.227564116" observedRunningTime="2026-02-24 03:13:37.345569964 +0000 UTC m=+1141.362640767" watchObservedRunningTime="2026-02-24 03:13:37.350203935 +0000 UTC m=+1141.367274748" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.363500 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s64ld"] Feb 24 03:13:37 crc kubenswrapper[4923]: 
I0224 03:13:37.368100 4923 scope.go:117] "RemoveContainer" containerID="df75237edd7c81620d9c0ebdc1abe57c3ca54b8c99ac10a55fb50a3bf608f301" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.371636 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-s64ld"] Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.395969 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-557784489b-tj6xd" podStartSLOduration=8.39595475 podStartE2EDuration="8.39595475s" podCreationTimestamp="2026-02-24 03:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:37.394101391 +0000 UTC m=+1141.411172214" watchObservedRunningTime="2026-02-24 03:13:37.39595475 +0000 UTC m=+1141.413025563" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.470025 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6cfd87c4f7-w99br"] Feb 24 03:13:37 crc kubenswrapper[4923]: E0224 03:13:37.470397 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" containerName="init" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.470415 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" containerName="init" Feb 24 03:13:37 crc kubenswrapper[4923]: E0224 03:13:37.470435 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" containerName="keystone-bootstrap" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.470442 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" containerName="keystone-bootstrap" Feb 24 03:13:37 crc kubenswrapper[4923]: E0224 03:13:37.470462 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" 
containerName="dnsmasq-dns" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.470468 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" containerName="dnsmasq-dns" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.470623 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" containerName="dnsmasq-dns" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.470654 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" containerName="keystone-bootstrap" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.471201 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.481602 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cfd87c4f7-w99br"] Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.482207 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.482347 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.482213 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.482583 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.483181 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.483399 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-26zv5" Feb 24 03:13:37 crc 
kubenswrapper[4923]: I0224 03:13:37.544812 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-fernet-keys\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.544861 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw57v\" (UniqueName: \"kubernetes.io/projected/26c252fe-d59a-4053-946d-b75bea1a9c0b-kube-api-access-gw57v\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.544909 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-credential-keys\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.544949 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-combined-ca-bundle\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.544996 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-config-data\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 
03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.545015 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-scripts\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.545046 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-public-tls-certs\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.545065 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-internal-tls-certs\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.646434 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-fernet-keys\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.646488 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw57v\" (UniqueName: \"kubernetes.io/projected/26c252fe-d59a-4053-946d-b75bea1a9c0b-kube-api-access-gw57v\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: 
I0224 03:13:37.646521 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-credential-keys\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.646553 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-combined-ca-bundle\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.646594 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-config-data\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.646612 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-scripts\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.646639 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-public-tls-certs\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.646656 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-internal-tls-certs\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.651062 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.651085 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.651177 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.651366 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.652715 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.661632 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-internal-tls-certs\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.665065 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-credential-keys\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.670328 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-fernet-keys\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.670958 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-public-tls-certs\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.671448 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-combined-ca-bundle\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.671722 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-scripts\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.676980 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw57v\" (UniqueName: \"kubernetes.io/projected/26c252fe-d59a-4053-946d-b75bea1a9c0b-kube-api-access-gw57v\") pod \"keystone-6cfd87c4f7-w99br\" (UID: \"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.681744 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c252fe-d59a-4053-946d-b75bea1a9c0b-config-data\") pod \"keystone-6cfd87c4f7-w99br\" (UID: 
\"26c252fe-d59a-4053-946d-b75bea1a9c0b\") " pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.729055 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33493b8c-7d7c-4ad5-9b81-12da1ae17aee" path="/var/lib/kubelet/pods/33493b8c-7d7c-4ad5-9b81-12da1ae17aee/volumes" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.798262 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-26zv5" Feb 24 03:13:37 crc kubenswrapper[4923]: I0224 03:13:37.807399 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:38 crc kubenswrapper[4923]: I0224 03:13:38.267804 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cfd87c4f7-w99br"] Feb 24 03:13:38 crc kubenswrapper[4923]: W0224 03:13:38.308890 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26c252fe_d59a_4053_946d_b75bea1a9c0b.slice/crio-2ffa6a712a0e36a3ffc28762d6eb40f0b098b5e459beeb48e540cc940054505d WatchSource:0}: Error finding container 2ffa6a712a0e36a3ffc28762d6eb40f0b098b5e459beeb48e540cc940054505d: Status 404 returned error can't find the container with id 2ffa6a712a0e36a3ffc28762d6eb40f0b098b5e459beeb48e540cc940054505d Feb 24 03:13:38 crc kubenswrapper[4923]: I0224 03:13:38.363349 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cfd87c4f7-w99br" event={"ID":"26c252fe-d59a-4053-946d-b75bea1a9c0b","Type":"ContainerStarted","Data":"2ffa6a712a0e36a3ffc28762d6eb40f0b098b5e459beeb48e540cc940054505d"} Feb 24 03:13:38 crc kubenswrapper[4923]: I0224 03:13:38.363397 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-557784489b-tj6xd" Feb 24 03:13:39 crc kubenswrapper[4923]: I0224 03:13:39.373336 4923 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-db-sync-gdl2d" event={"ID":"8ad7cefc-c3bb-48ff-ab05-0fe707823e84","Type":"ContainerStarted","Data":"858d5a53348c622d378ba1c9b0db685104423bf71d15ff285f867b6208fdf657"} Feb 24 03:13:39 crc kubenswrapper[4923]: I0224 03:13:39.377660 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cfd87c4f7-w99br" event={"ID":"26c252fe-d59a-4053-946d-b75bea1a9c0b","Type":"ContainerStarted","Data":"085518147d7515cc6a27164e8d6342b059619e9169e74eaf8a6c67abfaa0813a"} Feb 24 03:13:39 crc kubenswrapper[4923]: I0224 03:13:39.418523 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6cfd87c4f7-w99br" podStartSLOduration=2.418498338 podStartE2EDuration="2.418498338s" podCreationTimestamp="2026-02-24 03:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:39.412124111 +0000 UTC m=+1143.429194944" watchObservedRunningTime="2026-02-24 03:13:39.418498338 +0000 UTC m=+1143.435569151" Feb 24 03:13:39 crc kubenswrapper[4923]: I0224 03:13:39.419404 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gdl2d" podStartSLOduration=3.914109122 podStartE2EDuration="45.419289979s" podCreationTimestamp="2026-02-24 03:12:54 +0000 UTC" firstStartedPulling="2026-02-24 03:12:56.637140257 +0000 UTC m=+1100.654211070" lastFinishedPulling="2026-02-24 03:13:38.142321114 +0000 UTC m=+1142.159391927" observedRunningTime="2026-02-24 03:13:39.394690766 +0000 UTC m=+1143.411761599" watchObservedRunningTime="2026-02-24 03:13:39.419289979 +0000 UTC m=+1143.436360822" Feb 24 03:13:40 crc kubenswrapper[4923]: I0224 03:13:40.397925 4923 generic.go:334] "Generic (PLEG): container finished" podID="29f64c48-5ed4-431c-8636-702a8abf02b5" containerID="37eb58f3e17d647a959a2ac1115b7933598482b86904200c5c5c2f9e0e7dbb47" exitCode=0 Feb 24 03:13:40 crc kubenswrapper[4923]: I0224 
03:13:40.398356 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rvlxb" event={"ID":"29f64c48-5ed4-431c-8636-702a8abf02b5","Type":"ContainerDied","Data":"37eb58f3e17d647a959a2ac1115b7933598482b86904200c5c5c2f9e0e7dbb47"} Feb 24 03:13:40 crc kubenswrapper[4923]: I0224 03:13:40.398735 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:13:40 crc kubenswrapper[4923]: I0224 03:13:40.463686 4923 scope.go:117] "RemoveContainer" containerID="bda636fba5dc4e82c5072c2499c1c5799401a557d11f746a59519b1cd63cce71" Feb 24 03:13:40 crc kubenswrapper[4923]: I0224 03:13:40.493747 4923 scope.go:117] "RemoveContainer" containerID="d63fc6ea364b50d66d37162e62942b53cb7bab769e099b84d13d930a1736beba" Feb 24 03:13:43 crc kubenswrapper[4923]: I0224 03:13:43.428725 4923 generic.go:334] "Generic (PLEG): container finished" podID="8ad7cefc-c3bb-48ff-ab05-0fe707823e84" containerID="858d5a53348c622d378ba1c9b0db685104423bf71d15ff285f867b6208fdf657" exitCode=0 Feb 24 03:13:43 crc kubenswrapper[4923]: I0224 03:13:43.428814 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gdl2d" event={"ID":"8ad7cefc-c3bb-48ff-ab05-0fe707823e84","Type":"ContainerDied","Data":"858d5a53348c622d378ba1c9b0db685104423bf71d15ff285f867b6208fdf657"} Feb 24 03:13:43 crc kubenswrapper[4923]: I0224 03:13:43.768554 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7fb8677dd-w8wrp" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 24 03:13:43 crc kubenswrapper[4923]: I0224 03:13:43.836023 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6dcbd8cd94-497ns" podUID="3cad919b-bb41-4c17-a13a-01831e715fd9" containerName="horizon" probeResult="failure" 
output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Feb 24 03:13:43 crc kubenswrapper[4923]: I0224 03:13:43.905360 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:13:43 crc kubenswrapper[4923]: I0224 03:13:43.986093 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9skx\" (UniqueName: \"kubernetes.io/projected/29f64c48-5ed4-431c-8636-702a8abf02b5-kube-api-access-f9skx\") pod \"29f64c48-5ed4-431c-8636-702a8abf02b5\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " Feb 24 03:13:43 crc kubenswrapper[4923]: I0224 03:13:43.986234 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-db-sync-config-data\") pod \"29f64c48-5ed4-431c-8636-702a8abf02b5\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " Feb 24 03:13:43 crc kubenswrapper[4923]: I0224 03:13:43.986424 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-combined-ca-bundle\") pod \"29f64c48-5ed4-431c-8636-702a8abf02b5\" (UID: \"29f64c48-5ed4-431c-8636-702a8abf02b5\") " Feb 24 03:13:44 crc kubenswrapper[4923]: I0224 03:13:44.000692 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f64c48-5ed4-431c-8636-702a8abf02b5-kube-api-access-f9skx" (OuterVolumeSpecName: "kube-api-access-f9skx") pod "29f64c48-5ed4-431c-8636-702a8abf02b5" (UID: "29f64c48-5ed4-431c-8636-702a8abf02b5"). InnerVolumeSpecName "kube-api-access-f9skx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:44 crc kubenswrapper[4923]: I0224 03:13:44.001847 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "29f64c48-5ed4-431c-8636-702a8abf02b5" (UID: "29f64c48-5ed4-431c-8636-702a8abf02b5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:44 crc kubenswrapper[4923]: I0224 03:13:44.047931 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29f64c48-5ed4-431c-8636-702a8abf02b5" (UID: "29f64c48-5ed4-431c-8636-702a8abf02b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:44 crc kubenswrapper[4923]: I0224 03:13:44.089353 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:44 crc kubenswrapper[4923]: I0224 03:13:44.089387 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9skx\" (UniqueName: \"kubernetes.io/projected/29f64c48-5ed4-431c-8636-702a8abf02b5-kube-api-access-f9skx\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:44 crc kubenswrapper[4923]: I0224 03:13:44.089403 4923 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/29f64c48-5ed4-431c-8636-702a8abf02b5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:44 crc kubenswrapper[4923]: I0224 03:13:44.439329 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rvlxb" Feb 24 03:13:44 crc kubenswrapper[4923]: I0224 03:13:44.444368 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rvlxb" event={"ID":"29f64c48-5ed4-431c-8636-702a8abf02b5","Type":"ContainerDied","Data":"c601da1bc4bf317ab6af2fd9015d98846f670fd2e4a1070eae2ef6734e84ab96"} Feb 24 03:13:44 crc kubenswrapper[4923]: I0224 03:13:44.444397 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c601da1bc4bf317ab6af2fd9015d98846f670fd2e4a1070eae2ef6734e84ab96" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.235967 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-895b8674b-v44h4"] Feb 24 03:13:45 crc kubenswrapper[4923]: E0224 03:13:45.236844 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f64c48-5ed4-431c-8636-702a8abf02b5" containerName="barbican-db-sync" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.236933 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f64c48-5ed4-431c-8636-702a8abf02b5" containerName="barbican-db-sync" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.237169 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f64c48-5ed4-431c-8636-702a8abf02b5" containerName="barbican-db-sync" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.238119 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.242745 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.243427 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hj69t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.243515 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.249605 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c8bfc6649-mz55h"] Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.251062 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.254776 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.257843 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c8bfc6649-mz55h"] Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.266536 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-895b8674b-v44h4"] Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.310890 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bp25t"] Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.311776 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/774cca46-21ee-41c1-81e7-00c89c26ad37-logs\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " 
pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.311823 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-combined-ca-bundle\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.311858 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-logs\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.311877 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9md4w\" (UniqueName: \"kubernetes.io/projected/774cca46-21ee-41c1-81e7-00c89c26ad37-kube-api-access-9md4w\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.311921 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/774cca46-21ee-41c1-81e7-00c89c26ad37-config-data-custom\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.311958 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-config-data-custom\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.311973 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774cca46-21ee-41c1-81e7-00c89c26ad37-config-data\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.311992 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4d5n\" (UniqueName: \"kubernetes.io/projected/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-kube-api-access-k4d5n\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.312034 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774cca46-21ee-41c1-81e7-00c89c26ad37-combined-ca-bundle\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.312074 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-config-data\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.312266 4923 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.347615 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bp25t"] Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.414403 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/774cca46-21ee-41c1-81e7-00c89c26ad37-config-data-custom\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.414466 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.414497 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-config-data-custom\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.414518 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774cca46-21ee-41c1-81e7-00c89c26ad37-config-data\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.414536 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.414556 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4d5n\" (UniqueName: \"kubernetes.io/projected/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-kube-api-access-k4d5n\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.414818 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774cca46-21ee-41c1-81e7-00c89c26ad37-combined-ca-bundle\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.414956 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-config-data\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.415044 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/774cca46-21ee-41c1-81e7-00c89c26ad37-logs\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: 
I0224 03:13:45.415089 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.415114 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-config\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.415154 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-combined-ca-bundle\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.415200 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshzb\" (UniqueName: \"kubernetes.io/projected/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-kube-api-access-wshzb\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.415245 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-logs\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc 
kubenswrapper[4923]: I0224 03:13:45.415277 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9md4w\" (UniqueName: \"kubernetes.io/projected/774cca46-21ee-41c1-81e7-00c89c26ad37-kube-api-access-9md4w\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.415465 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.415885 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-logs\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.415900 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/774cca46-21ee-41c1-81e7-00c89c26ad37-logs\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.424755 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-combined-ca-bundle\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: 
I0224 03:13:45.424968 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-config-data-custom\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.425734 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/774cca46-21ee-41c1-81e7-00c89c26ad37-config-data\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.426202 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-config-data\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.435959 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/774cca46-21ee-41c1-81e7-00c89c26ad37-combined-ca-bundle\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.438323 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9md4w\" (UniqueName: \"kubernetes.io/projected/774cca46-21ee-41c1-81e7-00c89c26ad37-kube-api-access-9md4w\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc 
kubenswrapper[4923]: I0224 03:13:45.438446 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/774cca46-21ee-41c1-81e7-00c89c26ad37-config-data-custom\") pod \"barbican-keystone-listener-895b8674b-v44h4\" (UID: \"774cca46-21ee-41c1-81e7-00c89c26ad37\") " pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.446154 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4d5n\" (UniqueName: \"kubernetes.io/projected/4cafcd89-7a31-47f2-980b-9b9a6a21bd49-kube-api-access-k4d5n\") pod \"barbican-worker-7c8bfc6649-mz55h\" (UID: \"4cafcd89-7a31-47f2-980b-9b9a6a21bd49\") " pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.519490 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.519566 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.519601 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 
03:13:45.519699 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-config\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.519726 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.519768 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wshzb\" (UniqueName: \"kubernetes.io/projected/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-kube-api-access-wshzb\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.520719 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.520769 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.520863 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.521110 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-config\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.521263 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.541040 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshzb\" (UniqueName: \"kubernetes.io/projected/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-kube-api-access-wshzb\") pod \"dnsmasq-dns-848cf88cfc-bp25t\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.584260 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-895b8674b-v44h4" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.613803 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c8bfc6649-mz55h" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.640418 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b857749c4-v7hhc"] Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.641843 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.643714 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.648323 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.661724 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b857749c4-v7hhc"] Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.723522 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a2368e-86ca-4e7e-a681-291c4b6a0225-logs\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.723788 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfjvf\" (UniqueName: \"kubernetes.io/projected/28a2368e-86ca-4e7e-a681-291c4b6a0225-kube-api-access-pfjvf\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.723809 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-combined-ca-bundle\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.723826 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data-custom\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.724096 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.825839 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.826133 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a2368e-86ca-4e7e-a681-291c4b6a0225-logs\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.826276 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfjvf\" (UniqueName: 
\"kubernetes.io/projected/28a2368e-86ca-4e7e-a681-291c4b6a0225-kube-api-access-pfjvf\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.826407 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-combined-ca-bundle\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.826498 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data-custom\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.826649 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a2368e-86ca-4e7e-a681-291c4b6a0225-logs\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.830997 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data-custom\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.833702 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-combined-ca-bundle\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.834829 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.846376 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfjvf\" (UniqueName: \"kubernetes.io/projected/28a2368e-86ca-4e7e-a681-291c4b6a0225-kube-api-access-pfjvf\") pod \"barbican-api-7b857749c4-v7hhc\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:45 crc kubenswrapper[4923]: I0224 03:13:45.957385 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.873993 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gdl2d" Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.955079 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-scripts\") pod \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.955440 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js5tl\" (UniqueName: \"kubernetes.io/projected/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-kube-api-access-js5tl\") pod \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.956023 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-config-data\") pod \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.956153 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-db-sync-config-data\") pod \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.956281 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-etc-machine-id\") pod \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.956366 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-combined-ca-bundle\") pod \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\" (UID: \"8ad7cefc-c3bb-48ff-ab05-0fe707823e84\") " Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.960684 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8ad7cefc-c3bb-48ff-ab05-0fe707823e84" (UID: "8ad7cefc-c3bb-48ff-ab05-0fe707823e84"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.960909 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-scripts" (OuterVolumeSpecName: "scripts") pod "8ad7cefc-c3bb-48ff-ab05-0fe707823e84" (UID: "8ad7cefc-c3bb-48ff-ab05-0fe707823e84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.964673 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8ad7cefc-c3bb-48ff-ab05-0fe707823e84" (UID: "8ad7cefc-c3bb-48ff-ab05-0fe707823e84"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:46 crc kubenswrapper[4923]: I0224 03:13:46.968576 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-kube-api-access-js5tl" (OuterVolumeSpecName: "kube-api-access-js5tl") pod "8ad7cefc-c3bb-48ff-ab05-0fe707823e84" (UID: "8ad7cefc-c3bb-48ff-ab05-0fe707823e84"). InnerVolumeSpecName "kube-api-access-js5tl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:46 crc kubenswrapper[4923]: E0224 03:13:46.999695 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1710156b-5155-4340-8013-2f9e3d68be35" Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.009071 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ad7cefc-c3bb-48ff-ab05-0fe707823e84" (UID: "8ad7cefc-c3bb-48ff-ab05-0fe707823e84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.035865 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-config-data" (OuterVolumeSpecName: "config-data") pod "8ad7cefc-c3bb-48ff-ab05-0fe707823e84" (UID: "8ad7cefc-c3bb-48ff-ab05-0fe707823e84"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.058514 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.058571 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js5tl\" (UniqueName: \"kubernetes.io/projected/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-kube-api-access-js5tl\") on node \"crc\" DevicePath \"\""
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.058583 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-config-data\") on node \"crc\" DevicePath \"\""
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.058592 4923 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.058600 4923 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.058629 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad7cefc-c3bb-48ff-ab05-0fe707823e84-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.233247 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b857749c4-v7hhc"]
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.355798 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bp25t"]
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.370926 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-895b8674b-v44h4"]
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.445569 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c8bfc6649-mz55h"]
Feb 24 03:13:47 crc kubenswrapper[4923]: W0224 03:13:47.451938 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cafcd89_7a31_47f2_980b_9b9a6a21bd49.slice/crio-8fb745bb96d84689b589ec2a0b59a20893cc2984ce2f9e6762edc3f573648f1e WatchSource:0}: Error finding container 8fb745bb96d84689b589ec2a0b59a20893cc2984ce2f9e6762edc3f573648f1e: Status 404 returned error can't find the container with id 8fb745bb96d84689b589ec2a0b59a20893cc2984ce2f9e6762edc3f573648f1e
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.466103 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gdl2d" event={"ID":"8ad7cefc-c3bb-48ff-ab05-0fe707823e84","Type":"ContainerDied","Data":"047b94f6e9abd8bf2480128f928e3277ef8ecd15674f7d7aa9951c84eeb87bc6"}
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.466134 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gdl2d"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.466144 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="047b94f6e9abd8bf2480128f928e3277ef8ecd15674f7d7aa9951c84eeb87bc6"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.467531 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" event={"ID":"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094","Type":"ContainerStarted","Data":"8c9e9df5e62e86cb0fbee513502d4ff52704ecf634d5d1530f45a5f4059eddd5"}
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.468497 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8bfc6649-mz55h" event={"ID":"4cafcd89-7a31-47f2-980b-9b9a6a21bd49","Type":"ContainerStarted","Data":"8fb745bb96d84689b589ec2a0b59a20893cc2984ce2f9e6762edc3f573648f1e"}
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.471559 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1710156b-5155-4340-8013-2f9e3d68be35","Type":"ContainerStarted","Data":"27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902"}
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.472687 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="ceilometer-notification-agent" containerID="cri-o://29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49" gracePeriod=30
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.472876 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.473453 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="proxy-httpd" containerID="cri-o://27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902" gracePeriod=30
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.473530 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="sg-core" containerID="cri-o://e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974" gracePeriod=30
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.502563 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-895b8674b-v44h4" event={"ID":"774cca46-21ee-41c1-81e7-00c89c26ad37","Type":"ContainerStarted","Data":"1ad57c83ff8bcff361085bb655a456d021ae13b6f307ec50cc3caff5721aa824"}
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.520540 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b857749c4-v7hhc" event={"ID":"28a2368e-86ca-4e7e-a681-291c4b6a0225","Type":"ContainerStarted","Data":"48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a"}
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.520627 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b857749c4-v7hhc" event={"ID":"28a2368e-86ca-4e7e-a681-291c4b6a0225","Type":"ContainerStarted","Data":"11ee44d28f1b11458b1719deebf032c73fdac8aae529f903a0bc6d80634f699a"}
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.816335 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d7c54dbbb-xcg2j"]
Feb 24 03:13:47 crc kubenswrapper[4923]: E0224 03:13:47.817016 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad7cefc-c3bb-48ff-ab05-0fe707823e84" containerName="cinder-db-sync"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.817029 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad7cefc-c3bb-48ff-ab05-0fe707823e84" containerName="cinder-db-sync"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.817270 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad7cefc-c3bb-48ff-ab05-0fe707823e84" containerName="cinder-db-sync"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.818147 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.820908 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.821065 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.842609 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d7c54dbbb-xcg2j"]
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.974351 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-internal-tls-certs\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.975483 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-config-data\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.975664 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-logs\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.975692 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-combined-ca-bundle\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.975753 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnjp\" (UniqueName: \"kubernetes.io/projected/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-kube-api-access-plnjp\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.975816 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-config-data-custom\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:47 crc kubenswrapper[4923]: I0224 03:13:47.975850 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-public-tls-certs\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.077091 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-config-data-custom\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.077155 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-public-tls-certs\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.077202 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-internal-tls-certs\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.077252 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-config-data\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.077319 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-logs\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.077342 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-combined-ca-bundle\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.077397 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnjp\" (UniqueName: \"kubernetes.io/projected/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-kube-api-access-plnjp\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.078683 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-logs\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.081401 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.082202 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-config-data-custom\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.083154 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.084754 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-public-tls-certs\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.084980 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-config-data\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.089966 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-internal-tls-certs\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.104587 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.122893 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-combined-ca-bundle\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.123012 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mmk8t"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.123053 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.124780 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.125514 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.128880 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnjp\" (UniqueName: \"kubernetes.io/projected/6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed-kube-api-access-plnjp\") pod \"barbican-api-7d7c54dbbb-xcg2j\" (UID: \"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed\") " pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.150925 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d7c54dbbb-xcg2j"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.182384 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.182453 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.182503 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.182546 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxfzn\" (UniqueName: \"kubernetes.io/projected/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-kube-api-access-kxfzn\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.182601 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.182625 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.213940 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bp25t"]
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.266968 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7g9px"]
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.268792 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.284411 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.284479 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxfzn\" (UniqueName: \"kubernetes.io/projected/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-kube-api-access-kxfzn\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.284537 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.284563 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.284594 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.284630 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.290468 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.301677 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7g9px"]
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.306972 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.309690 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.311959 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.312214 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.314833 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxfzn\" (UniqueName: \"kubernetes.io/projected/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-kube-api-access-kxfzn\") pod \"cinder-scheduler-0\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.386029 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.386088 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-svc\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.386112 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.386166 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzdh4\" (UniqueName: \"kubernetes.io/projected/ec526ce6-1884-41fb-a3f1-070c22309734-kube-api-access-fzdh4\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.386212 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.386235 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-config\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.407603 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.417631 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.425041 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.455211 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.492540 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.492581 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-svc\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.492603 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.492643 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzdh4\" (UniqueName: \"kubernetes.io/projected/ec526ce6-1884-41fb-a3f1-070c22309734-kube-api-access-fzdh4\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.492670 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.492687 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-config\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.493812 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-config\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.494760 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.494969 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.496426 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.496579 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-svc\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.525572 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzdh4\" (UniqueName: \"kubernetes.io/projected/ec526ce6-1884-41fb-a3f1-070c22309734-kube-api-access-fzdh4\") pod \"dnsmasq-dns-6578955fd5-7g9px\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.550053 4923 generic.go:334] "Generic (PLEG): container finished" podID="1710156b-5155-4340-8013-2f9e3d68be35" containerID="27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902" exitCode=0
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.550084 4923 generic.go:334] "Generic (PLEG): container finished" podID="1710156b-5155-4340-8013-2f9e3d68be35" containerID="e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974" exitCode=2
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.550139 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1710156b-5155-4340-8013-2f9e3d68be35","Type":"ContainerDied","Data":"27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902"}
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.550167 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1710156b-5155-4340-8013-2f9e3d68be35","Type":"ContainerDied","Data":"e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974"}
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.552849 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b857749c4-v7hhc" event={"ID":"28a2368e-86ca-4e7e-a681-291c4b6a0225","Type":"ContainerStarted","Data":"5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f"}
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.554403 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b857749c4-v7hhc"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.554803 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.555430 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b857749c4-v7hhc"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.581272 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b857749c4-v7hhc" podStartSLOduration=3.581255883 podStartE2EDuration="3.581255883s" podCreationTimestamp="2026-02-24 03:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:48.580943945 +0000 UTC m=+1152.598014758" watchObservedRunningTime="2026-02-24 03:13:48.581255883 +0000 UTC m=+1152.598326696"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.583010 4923 generic.go:334] "Generic (PLEG): container finished" podID="ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" containerID="7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519" exitCode=0
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.583051 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" event={"ID":"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094","Type":"ContainerDied","Data":"7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519"}
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.594344 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.594399 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.594445 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data-custom\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.594459 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.594492 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-logs\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.594512 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-scripts\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.594574 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frznj\" (UniqueName: \"kubernetes.io/projected/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-kube-api-access-frznj\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.652758 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7g9px"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.696415 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.696738 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0"
Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.696811 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data-custom\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") "
pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.696826 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.696875 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-logs\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.696904 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-scripts\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.696981 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frznj\" (UniqueName: \"kubernetes.io/projected/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-kube-api-access-frznj\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.704442 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-logs\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.706717 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.716624 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-scripts\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.717065 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.719026 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data-custom\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.719869 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.721881 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frznj\" (UniqueName: \"kubernetes.io/projected/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-kube-api-access-frznj\") pod \"cinder-api-0\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " pod="openstack/cinder-api-0" Feb 24 03:13:48 crc kubenswrapper[4923]: I0224 03:13:48.806798 4923 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.009240 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d7c54dbbb-xcg2j"] Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.246249 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7g9px"] Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.257832 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.485721 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 24 03:13:49 crc kubenswrapper[4923]: W0224 03:13:49.524814 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46036e9e_c34a_4bef_bf63_55c1c2ac94e1.slice/crio-e7176cef4942aa175fa0eaae9bac53bd35d540ccb43728d2fd45e4140b99d23d WatchSource:0}: Error finding container e7176cef4942aa175fa0eaae9bac53bd35d540ccb43728d2fd45e4140b99d23d: Status 404 returned error can't find the container with id e7176cef4942aa175fa0eaae9bac53bd35d540ccb43728d2fd45e4140b99d23d Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.592489 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46036e9e-c34a-4bef-bf63-55c1c2ac94e1","Type":"ContainerStarted","Data":"e7176cef4942aa175fa0eaae9bac53bd35d540ccb43728d2fd45e4140b99d23d"} Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.593999 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b7cd41-26f8-4681-9b0a-adee15dfd6ec","Type":"ContainerStarted","Data":"469ca1415ace91495986e8cea47f23b0492edc800984d927ee8289434fd11df3"} Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.601015 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" event={"ID":"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094","Type":"ContainerStarted","Data":"b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122"} Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.601182 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" podUID="ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" containerName="dnsmasq-dns" containerID="cri-o://b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122" gracePeriod=10 Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.602250 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.605270 4923 generic.go:334] "Generic (PLEG): container finished" podID="ec526ce6-1884-41fb-a3f1-070c22309734" containerID="e37c5af9022ba26453fae52f6c25c27668f5470ea7c4070326747c0705bf4131" exitCode=0 Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.605331 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" event={"ID":"ec526ce6-1884-41fb-a3f1-070c22309734","Type":"ContainerDied","Data":"e37c5af9022ba26453fae52f6c25c27668f5470ea7c4070326747c0705bf4131"} Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.605352 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" event={"ID":"ec526ce6-1884-41fb-a3f1-070c22309734","Type":"ContainerStarted","Data":"09763cd3b326c17c848fd34115a0ba18160cbfcd6b331cbb102088fe279ba6cf"} Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.611659 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d7c54dbbb-xcg2j" event={"ID":"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed","Type":"ContainerStarted","Data":"cc225ef7020e634a7a2c5c0f31fe09490eaf209b4d0bd9d4b07870631583aca7"} Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.611923 4923 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d7c54dbbb-xcg2j" event={"ID":"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed","Type":"ContainerStarted","Data":"9d5beae3f9bede6a8e7af4414bdff499cb1f64fd0099bf72cc1b36f864ead43f"} Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.620804 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" podStartSLOduration=4.620783639 podStartE2EDuration="4.620783639s" podCreationTimestamp="2026-02-24 03:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:49.616062136 +0000 UTC m=+1153.633132949" watchObservedRunningTime="2026-02-24 03:13:49.620783639 +0000 UTC m=+1153.637854452" Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.916410 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.916471 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.916523 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.917256 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"40cb3d82b93cff9bd3bf829c2417332644f1c7038c262573b0f2c1eba50e9cc2"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 03:13:49 crc kubenswrapper[4923]: I0224 03:13:49.917339 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://40cb3d82b93cff9bd3bf829c2417332644f1c7038c262573b0f2c1eba50e9cc2" gracePeriod=600 Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.521281 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.564884 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-sb\") pod \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.564986 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-svc\") pod \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.565053 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-nb\") pod \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.565092 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-swift-storage-0\") pod \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.623338 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" (UID: "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.625226 4923 generic.go:334] "Generic (PLEG): container finished" podID="ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" containerID="b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122" exitCode=0 Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.625280 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" event={"ID":"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094","Type":"ContainerDied","Data":"b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122"} Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.625402 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" event={"ID":"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094","Type":"ContainerDied","Data":"8c9e9df5e62e86cb0fbee513502d4ff52704ecf634d5d1530f45a5f4059eddd5"} Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.625421 4923 scope.go:117] "RemoveContainer" containerID="b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.625546 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-bp25t" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.629909 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" (UID: "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.632090 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="40cb3d82b93cff9bd3bf829c2417332644f1c7038c262573b0f2c1eba50e9cc2" exitCode=0 Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.632142 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"40cb3d82b93cff9bd3bf829c2417332644f1c7038c262573b0f2c1eba50e9cc2"} Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.636753 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d7c54dbbb-xcg2j" event={"ID":"6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed","Type":"ContainerStarted","Data":"b2fbe207c64701b665ed88d17562d784557e483431005ebb522c8b0c0c0a1734"} Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.636787 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d7c54dbbb-xcg2j" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.636808 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d7c54dbbb-xcg2j" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.639339 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"46036e9e-c34a-4bef-bf63-55c1c2ac94e1","Type":"ContainerStarted","Data":"3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3"} Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.639666 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" (UID: "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.647906 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" (UID: "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.657864 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d7c54dbbb-xcg2j" podStartSLOduration=3.657845372 podStartE2EDuration="3.657845372s" podCreationTimestamp="2026-02-24 03:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:50.651963428 +0000 UTC m=+1154.669034241" watchObservedRunningTime="2026-02-24 03:13:50.657845372 +0000 UTC m=+1154.674916185" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.666933 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshzb\" (UniqueName: \"kubernetes.io/projected/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-kube-api-access-wshzb\") pod \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " Feb 24 
03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.666991 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-config\") pod \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\" (UID: \"ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094\") " Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.667352 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.667368 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.667378 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.667387 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.671394 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-kube-api-access-wshzb" (OuterVolumeSpecName: "kube-api-access-wshzb") pod "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" (UID: "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094"). InnerVolumeSpecName "kube-api-access-wshzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.712133 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-config" (OuterVolumeSpecName: "config") pod "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" (UID: "ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.768524 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wshzb\" (UniqueName: \"kubernetes.io/projected/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-kube-api-access-wshzb\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.768555 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.865144 4923 scope.go:117] "RemoveContainer" containerID="7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.919058 4923 scope.go:117] "RemoveContainer" containerID="b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122" Feb 24 03:13:50 crc kubenswrapper[4923]: E0224 03:13:50.919655 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122\": container with ID starting with b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122 not found: ID does not exist" containerID="b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.919684 4923 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122"} err="failed to get container status \"b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122\": rpc error: code = NotFound desc = could not find container \"b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122\": container with ID starting with b5ec0791c36f8a1d960413dab091f7b72169905f846e66f627030e5f2e12c122 not found: ID does not exist" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.919702 4923 scope.go:117] "RemoveContainer" containerID="7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519" Feb 24 03:13:50 crc kubenswrapper[4923]: E0224 03:13:50.919985 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519\": container with ID starting with 7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519 not found: ID does not exist" containerID="7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.920008 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519"} err="failed to get container status \"7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519\": rpc error: code = NotFound desc = could not find container \"7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519\": container with ID starting with 7bdb9920ec87a03bee636df23aa4e41e066831665635c28fbe2b6359e31f9519 not found: ID does not exist" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.920021 4923 scope.go:117] "RemoveContainer" containerID="369652d4e2fcdce7839d154f1d90c85b55a365ec3b7c320fea7e81e6fe472c3d" Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.964910 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-848cf88cfc-bp25t"] Feb 24 03:13:50 crc kubenswrapper[4923]: I0224 03:13:50.973195 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-bp25t"] Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.741913 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" path="/var/lib/kubelet/pods/ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094/volumes" Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.743103 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b7cd41-26f8-4681-9b0a-adee15dfd6ec","Type":"ContainerStarted","Data":"d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201"} Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.777531 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"84006aadd17b2e131a632622b49eac940374eaac532afbb7829f93e09553d367"} Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.822106 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" event={"ID":"ec526ce6-1884-41fb-a3f1-070c22309734","Type":"ContainerStarted","Data":"f114ca67678ea2d1c796aa96edf085f5b978e6d7c16f28713f642f436d9688be"} Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.822802 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.826625 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8bfc6649-mz55h" event={"ID":"4cafcd89-7a31-47f2-980b-9b9a6a21bd49","Type":"ContainerStarted","Data":"870b015fe85c44bcbdc80700a8987112117d64c501dc56dc3f141a580b297194"} Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.826692 4923 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8bfc6649-mz55h" event={"ID":"4cafcd89-7a31-47f2-980b-9b9a6a21bd49","Type":"ContainerStarted","Data":"8aa53e34ce9012b2ee03dfbb65b50613c59d3a254d4bf42f83e8c5ac94fa63a3"} Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.866661 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46036e9e-c34a-4bef-bf63-55c1c2ac94e1","Type":"ContainerStarted","Data":"d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040"} Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.867831 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.926731 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-895b8674b-v44h4" event={"ID":"774cca46-21ee-41c1-81e7-00c89c26ad37","Type":"ContainerStarted","Data":"0b79323d58b8212d53e49678785621c08c905b3a509f85fe6ecb26601a891268"} Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.926770 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-895b8674b-v44h4" event={"ID":"774cca46-21ee-41c1-81e7-00c89c26ad37","Type":"ContainerStarted","Data":"aa90304af58baaa4e816b51397ce579627958049db534b17e5a9e0e60a43f662"} Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.937850 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.948097 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" podStartSLOduration=3.948068093 podStartE2EDuration="3.948068093s" podCreationTimestamp="2026-02-24 03:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:51.894468834 +0000 UTC 
m=+1155.911539647" watchObservedRunningTime="2026-02-24 03:13:51.948068093 +0000 UTC m=+1155.965138906" Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.959756 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.9597364280000003 podStartE2EDuration="3.959736428s" podCreationTimestamp="2026-02-24 03:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:51.937729064 +0000 UTC m=+1155.954799877" watchObservedRunningTime="2026-02-24 03:13:51.959736428 +0000 UTC m=+1155.976807241" Feb 24 03:13:51 crc kubenswrapper[4923]: I0224 03:13:51.996208 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c8bfc6649-mz55h" podStartSLOduration=3.541610748 podStartE2EDuration="6.99619245s" podCreationTimestamp="2026-02-24 03:13:45 +0000 UTC" firstStartedPulling="2026-02-24 03:13:47.454090018 +0000 UTC m=+1151.471160831" lastFinishedPulling="2026-02-24 03:13:50.90867173 +0000 UTC m=+1154.925742533" observedRunningTime="2026-02-24 03:13:51.967163562 +0000 UTC m=+1155.984234375" watchObservedRunningTime="2026-02-24 03:13:51.99619245 +0000 UTC m=+1156.013263263" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.021085 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-895b8674b-v44h4" podStartSLOduration=3.489048245 podStartE2EDuration="7.021066619s" podCreationTimestamp="2026-02-24 03:13:45 +0000 UTC" firstStartedPulling="2026-02-24 03:13:47.376513172 +0000 UTC m=+1151.393583985" lastFinishedPulling="2026-02-24 03:13:50.908531546 +0000 UTC m=+1154.925602359" observedRunningTime="2026-02-24 03:13:51.998773827 +0000 UTC m=+1156.015844640" watchObservedRunningTime="2026-02-24 03:13:52.021066619 +0000 UTC m=+1156.038137432" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.778995 4923 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.927732 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-log-httpd\") pod \"1710156b-5155-4340-8013-2f9e3d68be35\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.927781 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-run-httpd\") pod \"1710156b-5155-4340-8013-2f9e3d68be35\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.927828 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-combined-ca-bundle\") pod \"1710156b-5155-4340-8013-2f9e3d68be35\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.927892 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-scripts\") pod \"1710156b-5155-4340-8013-2f9e3d68be35\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.927917 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-sg-core-conf-yaml\") pod \"1710156b-5155-4340-8013-2f9e3d68be35\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.927950 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-tdwmk\" (UniqueName: \"kubernetes.io/projected/1710156b-5155-4340-8013-2f9e3d68be35-kube-api-access-tdwmk\") pod \"1710156b-5155-4340-8013-2f9e3d68be35\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.928018 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-config-data\") pod \"1710156b-5155-4340-8013-2f9e3d68be35\" (UID: \"1710156b-5155-4340-8013-2f9e3d68be35\") " Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.929964 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1710156b-5155-4340-8013-2f9e3d68be35" (UID: "1710156b-5155-4340-8013-2f9e3d68be35"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.930247 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1710156b-5155-4340-8013-2f9e3d68be35" (UID: "1710156b-5155-4340-8013-2f9e3d68be35"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.935461 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-scripts" (OuterVolumeSpecName: "scripts") pod "1710156b-5155-4340-8013-2f9e3d68be35" (UID: "1710156b-5155-4340-8013-2f9e3d68be35"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.937668 4923 generic.go:334] "Generic (PLEG): container finished" podID="1710156b-5155-4340-8013-2f9e3d68be35" containerID="29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49" exitCode=0 Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.937792 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.937791 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1710156b-5155-4340-8013-2f9e3d68be35","Type":"ContainerDied","Data":"29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49"} Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.937840 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1710156b-5155-4340-8013-2f9e3d68be35","Type":"ContainerDied","Data":"69f4fb557fceb5fbc8b66c75a10cb44a33f78eac04c65a379bfd42adb35d6121"} Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.937857 4923 scope.go:117] "RemoveContainer" containerID="27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.940509 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b7cd41-26f8-4681-9b0a-adee15dfd6ec","Type":"ContainerStarted","Data":"7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d"} Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.967067 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1710156b-5155-4340-8013-2f9e3d68be35-kube-api-access-tdwmk" (OuterVolumeSpecName: "kube-api-access-tdwmk") pod "1710156b-5155-4340-8013-2f9e3d68be35" (UID: "1710156b-5155-4340-8013-2f9e3d68be35"). InnerVolumeSpecName "kube-api-access-tdwmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.979742 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.311207576 podStartE2EDuration="4.979726674s" podCreationTimestamp="2026-02-24 03:13:48 +0000 UTC" firstStartedPulling="2026-02-24 03:13:49.284048868 +0000 UTC m=+1153.301119681" lastFinishedPulling="2026-02-24 03:13:50.952567966 +0000 UTC m=+1154.969638779" observedRunningTime="2026-02-24 03:13:52.967994498 +0000 UTC m=+1156.985065331" watchObservedRunningTime="2026-02-24 03:13:52.979726674 +0000 UTC m=+1156.996797477" Feb 24 03:13:52 crc kubenswrapper[4923]: I0224 03:13:52.989384 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1710156b-5155-4340-8013-2f9e3d68be35" (UID: "1710156b-5155-4340-8013-2f9e3d68be35"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.010316 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1710156b-5155-4340-8013-2f9e3d68be35" (UID: "1710156b-5155-4340-8013-2f9e3d68be35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.031418 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.031457 4923 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.031469 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdwmk\" (UniqueName: \"kubernetes.io/projected/1710156b-5155-4340-8013-2f9e3d68be35-kube-api-access-tdwmk\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.031479 4923 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.031488 4923 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1710156b-5155-4340-8013-2f9e3d68be35-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.031496 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.041879 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-config-data" (OuterVolumeSpecName: "config-data") pod "1710156b-5155-4340-8013-2f9e3d68be35" (UID: "1710156b-5155-4340-8013-2f9e3d68be35"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.107546 4923 scope.go:117] "RemoveContainer" containerID="e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.123927 4923 scope.go:117] "RemoveContainer" containerID="29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.133536 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1710156b-5155-4340-8013-2f9e3d68be35-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.143110 4923 scope.go:117] "RemoveContainer" containerID="27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902" Feb 24 03:13:53 crc kubenswrapper[4923]: E0224 03:13:53.143517 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902\": container with ID starting with 27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902 not found: ID does not exist" containerID="27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.143559 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902"} err="failed to get container status \"27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902\": rpc error: code = NotFound desc = could not find container \"27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902\": container with ID starting with 27f45824b14505930c0a489a1518e252d3f5b1022a0f212e2b11ee59df34d902 not found: ID does not exist" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 
03:13:53.143584 4923 scope.go:117] "RemoveContainer" containerID="e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974" Feb 24 03:13:53 crc kubenswrapper[4923]: E0224 03:13:53.143847 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974\": container with ID starting with e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974 not found: ID does not exist" containerID="e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.143876 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974"} err="failed to get container status \"e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974\": rpc error: code = NotFound desc = could not find container \"e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974\": container with ID starting with e2007a68308228bd8cd1ff340880f2836a71c7c61100cae0f99249b1e4bc1974 not found: ID does not exist" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.143897 4923 scope.go:117] "RemoveContainer" containerID="29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49" Feb 24 03:13:53 crc kubenswrapper[4923]: E0224 03:13:53.144073 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49\": container with ID starting with 29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49 not found: ID does not exist" containerID="29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.144099 4923 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49"} err="failed to get container status \"29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49\": rpc error: code = NotFound desc = could not find container \"29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49\": container with ID starting with 29dd1459bab2849df92fc5edd0263715bb24232312faeab8175dc1b59df45b49 not found: ID does not exist" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.287695 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.333504 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.352349 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:13:53 crc kubenswrapper[4923]: E0224 03:13:53.353012 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="proxy-httpd" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.353085 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="proxy-httpd" Feb 24 03:13:53 crc kubenswrapper[4923]: E0224 03:13:53.353153 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" containerName="dnsmasq-dns" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.353206 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" containerName="dnsmasq-dns" Feb 24 03:13:53 crc kubenswrapper[4923]: E0224 03:13:53.353261 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="ceilometer-notification-agent" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.353325 4923 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="ceilometer-notification-agent" Feb 24 03:13:53 crc kubenswrapper[4923]: E0224 03:13:53.353452 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="sg-core" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.353514 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="sg-core" Feb 24 03:13:53 crc kubenswrapper[4923]: E0224 03:13:53.353648 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" containerName="init" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.353700 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" containerName="init" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.353912 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef0aaf2e-e6d7-4f1e-8d6e-28aa05633094" containerName="dnsmasq-dns" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.353972 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="ceilometer-notification-agent" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.354031 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="sg-core" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.354088 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="1710156b-5155-4340-8013-2f9e3d68be35" containerName="proxy-httpd" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.355764 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.359879 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.360377 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.360460 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.541638 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-log-httpd\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.541718 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.541739 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.541757 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-config-data\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " 
pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.541775 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-scripts\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.541793 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hpbx\" (UniqueName: \"kubernetes.io/projected/e4afbe98-b630-4013-83b8-778ddbeb8b27-kube-api-access-8hpbx\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.541918 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-run-httpd\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.559136 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.643885 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-run-httpd\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.643962 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-log-httpd\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " 
pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.644681 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-log-httpd\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.644699 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-run-httpd\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.644781 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.644805 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.644827 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-config-data\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.644850 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-scripts\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.644873 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hpbx\" (UniqueName: \"kubernetes.io/projected/e4afbe98-b630-4013-83b8-778ddbeb8b27-kube-api-access-8hpbx\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.651774 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.652604 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.655036 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-config-data\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.666504 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-scripts\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.669200 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hpbx\" (UniqueName: \"kubernetes.io/projected/e4afbe98-b630-4013-83b8-778ddbeb8b27-kube-api-access-8hpbx\") pod \"ceilometer-0\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.690417 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.735565 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1710156b-5155-4340-8013-2f9e3d68be35" path="/var/lib/kubelet/pods/1710156b-5155-4340-8013-2f9e3d68be35/volumes" Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.962651 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerName="cinder-api-log" containerID="cri-o://3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3" gracePeriod=30 Feb 24 03:13:53 crc kubenswrapper[4923]: I0224 03:13:53.963126 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerName="cinder-api" containerID="cri-o://d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040" gracePeriod=30 Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.141725 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.222852 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.394791 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86d47cfd49-7hzln"] Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.395207 4923 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/neutron-86d47cfd49-7hzln" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerName="neutron-api" containerID="cri-o://cb7411d44f45defbd1929cfc1eae6d03e59fdb9ab8d24efc07231c5658ea4b54" gracePeriod=30 Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.395950 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86d47cfd49-7hzln" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerName="neutron-httpd" containerID="cri-o://53b8b575df0335247e17f77ae72a9c76ce47979f5ebab2910de76cb5d115a980" gracePeriod=30 Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.422223 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-86d47cfd49-7hzln" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": read tcp 10.217.0.2:60836->10.217.0.157:9696: read: connection reset by peer" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.426165 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55b6b875d5-hmfv4"] Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.427835 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.430899 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55b6b875d5-hmfv4"] Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.476673 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.562480 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-httpd-config\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.562530 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-internal-tls-certs\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.562679 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8xz\" (UniqueName: \"kubernetes.io/projected/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-kube-api-access-mw8xz\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.562725 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-config\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.562856 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-ovndb-tls-certs\") pod \"neutron-55b6b875d5-hmfv4\" (UID: 
\"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.562961 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-combined-ca-bundle\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.563021 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-public-tls-certs\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.644031 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.665855 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-config-data\") pod \"bea6c18a-0093-4b3b-b56b-323a86181da5\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.665944 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bea6c18a-0093-4b3b-b56b-323a86181da5-horizon-secret-key\") pod \"bea6c18a-0093-4b3b-b56b-323a86181da5\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666022 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-scripts\") pod \"bea6c18a-0093-4b3b-b56b-323a86181da5\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666138 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bea6c18a-0093-4b3b-b56b-323a86181da5-logs\") pod \"bea6c18a-0093-4b3b-b56b-323a86181da5\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666189 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnjnl\" (UniqueName: \"kubernetes.io/projected/bea6c18a-0093-4b3b-b56b-323a86181da5-kube-api-access-xnjnl\") pod \"bea6c18a-0093-4b3b-b56b-323a86181da5\" (UID: \"bea6c18a-0093-4b3b-b56b-323a86181da5\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666591 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-httpd-config\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666633 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-internal-tls-certs\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666702 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw8xz\" (UniqueName: \"kubernetes.io/projected/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-kube-api-access-mw8xz\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666731 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-config\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666772 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-ovndb-tls-certs\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666813 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-combined-ca-bundle\") pod 
\"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.666864 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-public-tls-certs\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.671985 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bea6c18a-0093-4b3b-b56b-323a86181da5-logs" (OuterVolumeSpecName: "logs") pod "bea6c18a-0093-4b3b-b56b-323a86181da5" (UID: "bea6c18a-0093-4b3b-b56b-323a86181da5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.672993 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-public-tls-certs\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.681872 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-internal-tls-certs\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.684060 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-combined-ca-bundle\") pod \"neutron-55b6b875d5-hmfv4\" (UID: 
\"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.684843 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-ovndb-tls-certs\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.687434 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea6c18a-0093-4b3b-b56b-323a86181da5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bea6c18a-0093-4b3b-b56b-323a86181da5" (UID: "bea6c18a-0093-4b3b-b56b-323a86181da5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.689057 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-config\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.691496 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea6c18a-0093-4b3b-b56b-323a86181da5-kube-api-access-xnjnl" (OuterVolumeSpecName: "kube-api-access-xnjnl") pod "bea6c18a-0093-4b3b-b56b-323a86181da5" (UID: "bea6c18a-0093-4b3b-b56b-323a86181da5"). InnerVolumeSpecName "kube-api-access-xnjnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.697014 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-httpd-config\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.710146 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-config-data" (OuterVolumeSpecName: "config-data") pod "bea6c18a-0093-4b3b-b56b-323a86181da5" (UID: "bea6c18a-0093-4b3b-b56b-323a86181da5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.741474 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw8xz\" (UniqueName: \"kubernetes.io/projected/a13b787e-2ba9-4a5b-96d0-c1d044f4c958-kube-api-access-mw8xz\") pod \"neutron-55b6b875d5-hmfv4\" (UID: \"a13b787e-2ba9-4a5b-96d0-c1d044f4c958\") " pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.741880 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-scripts" (OuterVolumeSpecName: "scripts") pod "bea6c18a-0093-4b3b-b56b-323a86181da5" (UID: "bea6c18a-0093-4b3b-b56b-323a86181da5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.768341 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/059fc35a-7b75-46b7-86f3-7b05fb19c5de-logs\") pod \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.768599 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvglw\" (UniqueName: \"kubernetes.io/projected/059fc35a-7b75-46b7-86f3-7b05fb19c5de-kube-api-access-xvglw\") pod \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.768921 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/059fc35a-7b75-46b7-86f3-7b05fb19c5de-horizon-secret-key\") pod \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.768966 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-scripts\") pod \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.769009 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-config-data\") pod \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\" (UID: \"059fc35a-7b75-46b7-86f3-7b05fb19c5de\") " Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.769375 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.769388 4923 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bea6c18a-0093-4b3b-b56b-323a86181da5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.769398 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bea6c18a-0093-4b3b-b56b-323a86181da5-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.769406 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bea6c18a-0093-4b3b-b56b-323a86181da5-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.769416 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnjnl\" (UniqueName: \"kubernetes.io/projected/bea6c18a-0093-4b3b-b56b-323a86181da5-kube-api-access-xnjnl\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.775903 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059fc35a-7b75-46b7-86f3-7b05fb19c5de-kube-api-access-xvglw" (OuterVolumeSpecName: "kube-api-access-xvglw") pod "059fc35a-7b75-46b7-86f3-7b05fb19c5de" (UID: "059fc35a-7b75-46b7-86f3-7b05fb19c5de"). InnerVolumeSpecName "kube-api-access-xvglw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.776647 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.776715 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/059fc35a-7b75-46b7-86f3-7b05fb19c5de-logs" (OuterVolumeSpecName: "logs") pod "059fc35a-7b75-46b7-86f3-7b05fb19c5de" (UID: "059fc35a-7b75-46b7-86f3-7b05fb19c5de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.801635 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059fc35a-7b75-46b7-86f3-7b05fb19c5de-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "059fc35a-7b75-46b7-86f3-7b05fb19c5de" (UID: "059fc35a-7b75-46b7-86f3-7b05fb19c5de"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.828418 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-scripts" (OuterVolumeSpecName: "scripts") pod "059fc35a-7b75-46b7-86f3-7b05fb19c5de" (UID: "059fc35a-7b75-46b7-86f3-7b05fb19c5de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.830732 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-config-data" (OuterVolumeSpecName: "config-data") pod "059fc35a-7b75-46b7-86f3-7b05fb19c5de" (UID: "059fc35a-7b75-46b7-86f3-7b05fb19c5de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.871536 4923 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/059fc35a-7b75-46b7-86f3-7b05fb19c5de-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.871563 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.871572 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059fc35a-7b75-46b7-86f3-7b05fb19c5de-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.871582 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/059fc35a-7b75-46b7-86f3-7b05fb19c5de-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.871590 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvglw\" (UniqueName: \"kubernetes.io/projected/059fc35a-7b75-46b7-86f3-7b05fb19c5de-kube-api-access-xvglw\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.952171 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:13:54 crc kubenswrapper[4923]: I0224 03:13:54.968970 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.001530 4923 generic.go:334] "Generic (PLEG): container finished" podID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerID="53b8b575df0335247e17f77ae72a9c76ce47979f5ebab2910de76cb5d115a980" exitCode=0 Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.001620 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d47cfd49-7hzln" event={"ID":"c2b74cd8-a2f1-4db6-b604-1add73452a54","Type":"ContainerDied","Data":"53b8b575df0335247e17f77ae72a9c76ce47979f5ebab2910de76cb5d115a980"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.016466 4923 generic.go:334] "Generic (PLEG): container finished" podID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerID="404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9" exitCode=137 Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.016491 4923 generic.go:334] "Generic (PLEG): container finished" podID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerID="5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a" exitCode=137 Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.016540 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc6fbcd4c-xk5sx" event={"ID":"bea6c18a-0093-4b3b-b56b-323a86181da5","Type":"ContainerDied","Data":"404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.016567 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc6fbcd4c-xk5sx" event={"ID":"bea6c18a-0093-4b3b-b56b-323a86181da5","Type":"ContainerDied","Data":"5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.016579 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bc6fbcd4c-xk5sx" 
event={"ID":"bea6c18a-0093-4b3b-b56b-323a86181da5","Type":"ContainerDied","Data":"adbc3c3b27b2af992f4955fc8858e13898ae934fc89ce290fa1cc73c9efa32fe"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.016594 4923 scope.go:117] "RemoveContainer" containerID="404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.016719 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bc6fbcd4c-xk5sx" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.037500 4923 generic.go:334] "Generic (PLEG): container finished" podID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerID="77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b" exitCode=137 Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.037530 4923 generic.go:334] "Generic (PLEG): container finished" podID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerID="e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9" exitCode=137 Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.037580 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75cfdb654f-g2wdh" event={"ID":"059fc35a-7b75-46b7-86f3-7b05fb19c5de","Type":"ContainerDied","Data":"77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.037610 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75cfdb654f-g2wdh" event={"ID":"059fc35a-7b75-46b7-86f3-7b05fb19c5de","Type":"ContainerDied","Data":"e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.037625 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75cfdb654f-g2wdh" event={"ID":"059fc35a-7b75-46b7-86f3-7b05fb19c5de","Type":"ContainerDied","Data":"6d078c76b86ddbfb1d26b5c70bb353cb25212682c9c4491cad1f547a24ef9f67"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 
03:13:55.037701 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75cfdb654f-g2wdh" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.043420 4923 generic.go:334] "Generic (PLEG): container finished" podID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerID="d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040" exitCode=0 Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.043453 4923 generic.go:334] "Generic (PLEG): container finished" podID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerID="3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3" exitCode=143 Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.043500 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46036e9e-c34a-4bef-bf63-55c1c2ac94e1","Type":"ContainerDied","Data":"d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.043528 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46036e9e-c34a-4bef-bf63-55c1c2ac94e1","Type":"ContainerDied","Data":"3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.043539 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46036e9e-c34a-4bef-bf63-55c1c2ac94e1","Type":"ContainerDied","Data":"e7176cef4942aa175fa0eaae9bac53bd35d540ccb43728d2fd45e4140b99d23d"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.043597 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.054530 4923 generic.go:334] "Generic (PLEG): container finished" podID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerID="bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb" exitCode=137 Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.054741 4923 generic.go:334] "Generic (PLEG): container finished" podID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerID="55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a" exitCode=137 Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.054856 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b899b65-wzw2v" event={"ID":"f3e8092d-b9af-4e2f-a5f1-0682e2eff867","Type":"ContainerDied","Data":"bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.054995 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b899b65-wzw2v" event={"ID":"f3e8092d-b9af-4e2f-a5f1-0682e2eff867","Type":"ContainerDied","Data":"55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.055076 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b899b65-wzw2v" event={"ID":"f3e8092d-b9af-4e2f-a5f1-0682e2eff867","Type":"ContainerDied","Data":"e003e85df27da7d276da5759ffbd9359a348f7ff905dd51ceaedcdf6cf8fd192"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.055592 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-575b899b65-wzw2v" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.069510 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerStarted","Data":"55be6b58f77865c8b5e518a89747df4badd0e6f61fc1c797a33338e93a66f35a"} Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075535 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-scripts\") pod \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075571 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data\") pod \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075590 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-combined-ca-bundle\") pod \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075606 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-config-data\") pod \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075642 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frznj\" (UniqueName: 
\"kubernetes.io/projected/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-kube-api-access-frznj\") pod \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075676 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-logs\") pod \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075755 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data-custom\") pod \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075785 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9c49\" (UniqueName: \"kubernetes.io/projected/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-kube-api-access-z9c49\") pod \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075817 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-logs\") pod \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075845 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-etc-machine-id\") pod \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075875 4923 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-scripts\") pod \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\" (UID: \"46036e9e-c34a-4bef-bf63-55c1c2ac94e1\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.075906 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-horizon-secret-key\") pod \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\" (UID: \"f3e8092d-b9af-4e2f-a5f1-0682e2eff867\") " Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.077768 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-logs" (OuterVolumeSpecName: "logs") pod "46036e9e-c34a-4bef-bf63-55c1c2ac94e1" (UID: "46036e9e-c34a-4bef-bf63-55c1c2ac94e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.079650 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-logs" (OuterVolumeSpecName: "logs") pod "f3e8092d-b9af-4e2f-a5f1-0682e2eff867" (UID: "f3e8092d-b9af-4e2f-a5f1-0682e2eff867"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.080345 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bc6fbcd4c-xk5sx"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.082381 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "46036e9e-c34a-4bef-bf63-55c1c2ac94e1" (UID: "46036e9e-c34a-4bef-bf63-55c1c2ac94e1"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.092593 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bc6fbcd4c-xk5sx"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.094449 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-scripts" (OuterVolumeSpecName: "scripts") pod "46036e9e-c34a-4bef-bf63-55c1c2ac94e1" (UID: "46036e9e-c34a-4bef-bf63-55c1c2ac94e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.096513 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-kube-api-access-frznj" (OuterVolumeSpecName: "kube-api-access-frznj") pod "46036e9e-c34a-4bef-bf63-55c1c2ac94e1" (UID: "46036e9e-c34a-4bef-bf63-55c1c2ac94e1"). InnerVolumeSpecName "kube-api-access-frznj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.096650 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "46036e9e-c34a-4bef-bf63-55c1c2ac94e1" (UID: "46036e9e-c34a-4bef-bf63-55c1c2ac94e1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.097612 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f3e8092d-b9af-4e2f-a5f1-0682e2eff867" (UID: "f3e8092d-b9af-4e2f-a5f1-0682e2eff867"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.101596 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-kube-api-access-z9c49" (OuterVolumeSpecName: "kube-api-access-z9c49") pod "f3e8092d-b9af-4e2f-a5f1-0682e2eff867" (UID: "f3e8092d-b9af-4e2f-a5f1-0682e2eff867"). InnerVolumeSpecName "kube-api-access-z9c49". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.117509 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75cfdb654f-g2wdh"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.119720 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-config-data" (OuterVolumeSpecName: "config-data") pod "f3e8092d-b9af-4e2f-a5f1-0682e2eff867" (UID: "f3e8092d-b9af-4e2f-a5f1-0682e2eff867"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.133062 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75cfdb654f-g2wdh"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.145257 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-scripts" (OuterVolumeSpecName: "scripts") pod "f3e8092d-b9af-4e2f-a5f1-0682e2eff867" (UID: "f3e8092d-b9af-4e2f-a5f1-0682e2eff867"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.147408 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46036e9e-c34a-4bef-bf63-55c1c2ac94e1" (UID: "46036e9e-c34a-4bef-bf63-55c1c2ac94e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186133 4923 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186181 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9c49\" (UniqueName: \"kubernetes.io/projected/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-kube-api-access-z9c49\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186196 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186210 4923 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186222 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186236 4923 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186319 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186334 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186345 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3e8092d-b9af-4e2f-a5f1-0682e2eff867-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186358 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frznj\" (UniqueName: \"kubernetes.io/projected/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-kube-api-access-frznj\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.186370 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.193466 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data" (OuterVolumeSpecName: "config-data") pod "46036e9e-c34a-4bef-bf63-55c1c2ac94e1" (UID: "46036e9e-c34a-4bef-bf63-55c1c2ac94e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.273111 4923 scope.go:117] "RemoveContainer" containerID="5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.289464 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46036e9e-c34a-4bef-bf63-55c1c2ac94e1-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.340197 4923 scope.go:117] "RemoveContainer" containerID="404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.343446 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9\": container with ID starting with 404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9 not found: ID does not exist" containerID="404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.343492 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9"} err="failed to get container status \"404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9\": rpc error: code = NotFound desc = could not find container \"404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9\": container with ID starting with 404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9 not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.343523 4923 scope.go:117] "RemoveContainer" containerID="5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.345514 4923 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a\": container with ID starting with 5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a not found: ID does not exist" containerID="5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.345550 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a"} err="failed to get container status \"5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a\": rpc error: code = NotFound desc = could not find container \"5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a\": container with ID starting with 5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.345570 4923 scope.go:117] "RemoveContainer" containerID="404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.348967 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9"} err="failed to get container status \"404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9\": rpc error: code = NotFound desc = could not find container \"404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9\": container with ID starting with 404a6ca207b461006a0893ba7093924758508651a5fced8f73110ec9546137d9 not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.349014 4923 scope.go:117] "RemoveContainer" containerID="5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.352778 4923 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a"} err="failed to get container status \"5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a\": rpc error: code = NotFound desc = could not find container \"5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a\": container with ID starting with 5ca27549684660b24e8debe6a7249a3eaa72e7749c35ed72e9f3f97a1f9dbe4a not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.352821 4923 scope.go:117] "RemoveContainer" containerID="77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.403587 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.416381 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.449402 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.450906 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerName="cinder-api-log" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.450941 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerName="cinder-api-log" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.450963 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerName="horizon-log" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.450970 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerName="horizon-log" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.450984 4923 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerName="cinder-api" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.450991 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerName="cinder-api" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.451009 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerName="horizon" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451015 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerName="horizon" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.451035 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerName="horizon-log" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451044 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerName="horizon-log" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.451055 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerName="horizon-log" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451062 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerName="horizon-log" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.451078 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerName="horizon" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451084 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerName="horizon" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.451093 4923 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerName="horizon" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451099 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerName="horizon" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451515 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerName="horizon-log" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451541 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerName="cinder-api" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451561 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerName="horizon" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451573 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerName="horizon" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451594 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" containerName="cinder-api-log" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451615 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" containerName="horizon" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451627 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea6c18a-0093-4b3b-b56b-323a86181da5" containerName="horizon-log" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.451638 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" containerName="horizon-log" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.453107 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.458756 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.459152 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.460919 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.481452 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-575b899b65-wzw2v"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.495364 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.509020 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-575b899b65-wzw2v"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.519369 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55b6b875d5-hmfv4"] Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.598747 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e3799b0-c8b2-4204-8b12-62e28dee2c09-logs\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.598877 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-config-data-custom\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.598940 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.598973 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.599134 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sk8v\" (UniqueName: \"kubernetes.io/projected/4e3799b0-c8b2-4204-8b12-62e28dee2c09-kube-api-access-7sk8v\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.599183 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-scripts\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.599206 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-config-data\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.599307 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/4e3799b0-c8b2-4204-8b12-62e28dee2c09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.599334 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.600470 4923 scope.go:117] "RemoveContainer" containerID="e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.627486 4923 scope.go:117] "RemoveContainer" containerID="77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.627987 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b\": container with ID starting with 77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b not found: ID does not exist" containerID="77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.628121 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b"} err="failed to get container status \"77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b\": rpc error: code = NotFound desc = could not find container \"77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b\": container with ID starting with 77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b not found: ID does not exist" Feb 24 
03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.628228 4923 scope.go:117] "RemoveContainer" containerID="e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.628547 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9\": container with ID starting with e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9 not found: ID does not exist" containerID="e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.628667 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9"} err="failed to get container status \"e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9\": rpc error: code = NotFound desc = could not find container \"e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9\": container with ID starting with e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9 not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.628750 4923 scope.go:117] "RemoveContainer" containerID="77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.629113 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b"} err="failed to get container status \"77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b\": rpc error: code = NotFound desc = could not find container \"77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b\": container with ID starting with 77fb0ea252c58da0bef8058e4d65911ed0ade30d008df8f49089d4fe1b24883b not found: ID does not 
exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.629163 4923 scope.go:117] "RemoveContainer" containerID="e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.633588 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9"} err="failed to get container status \"e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9\": rpc error: code = NotFound desc = could not find container \"e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9\": container with ID starting with e4619b9b56030cf5542cf1786d7a63703cdf7cd96c73f563f5151086476b6bd9 not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.633655 4923 scope.go:117] "RemoveContainer" containerID="d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.658388 4923 scope.go:117] "RemoveContainer" containerID="3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.708652 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-config-data-custom\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.708713 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.708738 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.716420 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.717648 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sk8v\" (UniqueName: \"kubernetes.io/projected/4e3799b0-c8b2-4204-8b12-62e28dee2c09-kube-api-access-7sk8v\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.718545 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-scripts\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.718582 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-config-data\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.718651 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e3799b0-c8b2-4204-8b12-62e28dee2c09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 
24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.718685 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.718753 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e3799b0-c8b2-4204-8b12-62e28dee2c09-logs\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.719144 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e3799b0-c8b2-4204-8b12-62e28dee2c09-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.719242 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e3799b0-c8b2-4204-8b12-62e28dee2c09-logs\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.722319 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-config-data\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.728722 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-scripts\") pod \"cinder-api-0\" (UID: 
\"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.731801 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.735683 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.739689 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sk8v\" (UniqueName: \"kubernetes.io/projected/4e3799b0-c8b2-4204-8b12-62e28dee2c09-kube-api-access-7sk8v\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.741344 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e3799b0-c8b2-4204-8b12-62e28dee2c09-config-data-custom\") pod \"cinder-api-0\" (UID: \"4e3799b0-c8b2-4204-8b12-62e28dee2c09\") " pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.748741 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059fc35a-7b75-46b7-86f3-7b05fb19c5de" path="/var/lib/kubelet/pods/059fc35a-7b75-46b7-86f3-7b05fb19c5de/volumes" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.749346 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46036e9e-c34a-4bef-bf63-55c1c2ac94e1" path="/var/lib/kubelet/pods/46036e9e-c34a-4bef-bf63-55c1c2ac94e1/volumes" Feb 
24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.750072 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea6c18a-0093-4b3b-b56b-323a86181da5" path="/var/lib/kubelet/pods/bea6c18a-0093-4b3b-b56b-323a86181da5/volumes" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.754871 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3e8092d-b9af-4e2f-a5f1-0682e2eff867" path="/var/lib/kubelet/pods/f3e8092d-b9af-4e2f-a5f1-0682e2eff867/volumes" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.794100 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.957083 4923 scope.go:117] "RemoveContainer" containerID="d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.976678 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040\": container with ID starting with d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040 not found: ID does not exist" containerID="d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.976719 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040"} err="failed to get container status \"d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040\": rpc error: code = NotFound desc = could not find container \"d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040\": container with ID starting with d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040 not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.976742 4923 scope.go:117] 
"RemoveContainer" containerID="3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3" Feb 24 03:13:55 crc kubenswrapper[4923]: E0224 03:13:55.979771 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3\": container with ID starting with 3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3 not found: ID does not exist" containerID="3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.979813 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3"} err="failed to get container status \"3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3\": rpc error: code = NotFound desc = could not find container \"3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3\": container with ID starting with 3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3 not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.979848 4923 scope.go:117] "RemoveContainer" containerID="d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.983578 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040"} err="failed to get container status \"d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040\": rpc error: code = NotFound desc = could not find container \"d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040\": container with ID starting with d57ad951148b4e358daf87d157cf34236f0483e47fe7eacfb773e038c118e040 not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.983628 4923 
scope.go:117] "RemoveContainer" containerID="3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.984009 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3"} err="failed to get container status \"3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3\": rpc error: code = NotFound desc = could not find container \"3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3\": container with ID starting with 3e66f4b4de2e690e470b2388b67c282030812ff8e91d93a005fa00de2731c7b3 not found: ID does not exist" Feb 24 03:13:55 crc kubenswrapper[4923]: I0224 03:13:55.984031 4923 scope.go:117] "RemoveContainer" containerID="bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.141871 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerStarted","Data":"c1dfd6f41883eed123c0ecd39d79fe39ad649e65bcf3aa63c4da62a4daf18679"} Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.141913 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerStarted","Data":"5919387f8f34160d6612365685b906a41eb578b59a086e4c64db02f0f442727a"} Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.152459 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b6b875d5-hmfv4" event={"ID":"a13b787e-2ba9-4a5b-96d0-c1d044f4c958","Type":"ContainerStarted","Data":"74fc15f94c12c1a64c5c91e471766eeeb1adaf84221d77398c9437ad6d6ba596"} Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.152505 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b6b875d5-hmfv4" 
event={"ID":"a13b787e-2ba9-4a5b-96d0-c1d044f4c958","Type":"ContainerStarted","Data":"b197ebaa36c022cbd308dfea36224e7ea427d3edb908255de616209a22047cb0"} Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.216758 4923 scope.go:117] "RemoveContainer" containerID="55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.254999 4923 scope.go:117] "RemoveContainer" containerID="bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb" Feb 24 03:13:56 crc kubenswrapper[4923]: E0224 03:13:56.258118 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb\": container with ID starting with bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb not found: ID does not exist" containerID="bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.258232 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb"} err="failed to get container status \"bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb\": rpc error: code = NotFound desc = could not find container \"bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb\": container with ID starting with bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb not found: ID does not exist" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.258340 4923 scope.go:117] "RemoveContainer" containerID="55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a" Feb 24 03:13:56 crc kubenswrapper[4923]: E0224 03:13:56.260085 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a\": container with ID starting with 55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a not found: ID does not exist" containerID="55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.260188 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a"} err="failed to get container status \"55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a\": rpc error: code = NotFound desc = could not find container \"55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a\": container with ID starting with 55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a not found: ID does not exist" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.260306 4923 scope.go:117] "RemoveContainer" containerID="bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.264283 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb"} err="failed to get container status \"bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb\": rpc error: code = NotFound desc = could not find container \"bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb\": container with ID starting with bf1a9459daa3bf4ed550d034c7b35baf3d492a6eb80160b48cf2d0352afe1beb not found: ID does not exist" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.264459 4923 scope.go:117] "RemoveContainer" containerID="55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.264739 4923 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a"} err="failed to get container status \"55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a\": rpc error: code = NotFound desc = could not find container \"55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a\": container with ID starting with 55caf8825a51ea7fe7ae7f9d456f35a07a433d1138d0f70bd08410515059d34a not found: ID does not exist" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.298891 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.508920 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.545938 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-86d47cfd49-7hzln" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Feb 24 03:13:56 crc kubenswrapper[4923]: I0224 03:13:56.991086 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:57 crc kubenswrapper[4923]: I0224 03:13:57.188631 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b6b875d5-hmfv4" event={"ID":"a13b787e-2ba9-4a5b-96d0-c1d044f4c958","Type":"ContainerStarted","Data":"340542b4b611c59b8f087ad5df3097049f4d914b41719459c37f9e887499ea65"} Feb 24 03:13:57 crc kubenswrapper[4923]: I0224 03:13:57.189231 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:13:57 crc kubenswrapper[4923]: I0224 03:13:57.212877 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerStarted","Data":"1f1bc218cb4d79edfad54a297ed32f7856e72c8f62400a264696b02673014592"} Feb 24 03:13:57 crc kubenswrapper[4923]: I0224 03:13:57.217857 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55b6b875d5-hmfv4" podStartSLOduration=3.217840741 podStartE2EDuration="3.217840741s" podCreationTimestamp="2026-02-24 03:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:57.206635578 +0000 UTC m=+1161.223706391" watchObservedRunningTime="2026-02-24 03:13:57.217840741 +0000 UTC m=+1161.234911554" Feb 24 03:13:57 crc kubenswrapper[4923]: I0224 03:13:57.219664 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4e3799b0-c8b2-4204-8b12-62e28dee2c09","Type":"ContainerStarted","Data":"c973ca9213a803d52494b26c59238be4f3e913b2c9fb00acf179d0e24aae05ac"} Feb 24 03:13:57 crc kubenswrapper[4923]: I0224 03:13:57.219695 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4e3799b0-c8b2-4204-8b12-62e28dee2c09","Type":"ContainerStarted","Data":"ffd3b1ed1c1fa2c86ec3ad0e672e36482638e017c7826895c10a1ded7e5eae9e"} Feb 24 03:13:57 crc kubenswrapper[4923]: I0224 03:13:57.555851 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:57 crc kubenswrapper[4923]: I0224 03:13:57.987163 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:13:58 crc kubenswrapper[4923]: I0224 03:13:58.235650 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4e3799b0-c8b2-4204-8b12-62e28dee2c09","Type":"ContainerStarted","Data":"0c7c85493544ee1c082d17b501cdc6aaabb98043fa1535c01e6ef94f0abb7273"} Feb 24 03:13:58 
crc kubenswrapper[4923]: I0224 03:13:58.260611 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.260595622 podStartE2EDuration="3.260595622s" podCreationTimestamp="2026-02-24 03:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:13:58.259139954 +0000 UTC m=+1162.276210767" watchObservedRunningTime="2026-02-24 03:13:58.260595622 +0000 UTC m=+1162.277666435" Feb 24 03:13:58 crc kubenswrapper[4923]: I0224 03:13:58.660362 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" Feb 24 03:13:58 crc kubenswrapper[4923]: I0224 03:13:58.767137 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zxvbv"] Feb 24 03:13:58 crc kubenswrapper[4923]: I0224 03:13:58.767377 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" podUID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" containerName="dnsmasq-dns" containerID="cri-o://ed57864f95ca8fb86656872b4e3ade157b262bc5e879a44e51bf9610c281373c" gracePeriod=10 Feb 24 03:13:58 crc kubenswrapper[4923]: I0224 03:13:58.898722 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:13:58 crc kubenswrapper[4923]: I0224 03:13:58.992689 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.047607 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.285643 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerStarted","Data":"8298593c5109efd36f80cd6f37e412910893e60ae562106e275e501ab8ef2283"} Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.286410 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.297969 4923 generic.go:334] "Generic (PLEG): container finished" podID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" containerID="ed57864f95ca8fb86656872b4e3ade157b262bc5e879a44e51bf9610c281373c" exitCode=0 Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.298160 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerName="cinder-scheduler" containerID="cri-o://d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201" gracePeriod=30 Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.298420 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" event={"ID":"d2108dcb-bf35-433c-8fc3-49a4e63da0fe","Type":"ContainerDied","Data":"ed57864f95ca8fb86656872b4e3ade157b262bc5e879a44e51bf9610c281373c"} Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.298467 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerName="probe" containerID="cri-o://7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d" gracePeriod=30 Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.299279 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.314570 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.873634255 podStartE2EDuration="6.314553026s" 
podCreationTimestamp="2026-02-24 03:13:53 +0000 UTC" firstStartedPulling="2026-02-24 03:13:54.349282027 +0000 UTC m=+1158.366352840" lastFinishedPulling="2026-02-24 03:13:58.790200798 +0000 UTC m=+1162.807271611" observedRunningTime="2026-02-24 03:13:59.312580095 +0000 UTC m=+1163.329650908" watchObservedRunningTime="2026-02-24 03:13:59.314553026 +0000 UTC m=+1163.331623839" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.378981 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.394963 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6dcbd8cd94-497ns" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.451772 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fb8677dd-w8wrp"] Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.451972 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7fb8677dd-w8wrp" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon-log" containerID="cri-o://56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f" gracePeriod=30 Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.452112 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7fb8677dd-w8wrp" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon" containerID="cri-o://eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112" gracePeriod=30 Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.514091 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-config\") pod \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 
03:13:59.514166 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ff4c\" (UniqueName: \"kubernetes.io/projected/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-kube-api-access-7ff4c\") pod \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.514266 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-swift-storage-0\") pod \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.514347 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-nb\") pod \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.514388 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-sb\") pod \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.514412 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-svc\") pod \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\" (UID: \"d2108dcb-bf35-433c-8fc3-49a4e63da0fe\") " Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.533169 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-kube-api-access-7ff4c" (OuterVolumeSpecName: 
"kube-api-access-7ff4c") pod "d2108dcb-bf35-433c-8fc3-49a4e63da0fe" (UID: "d2108dcb-bf35-433c-8fc3-49a4e63da0fe"). InnerVolumeSpecName "kube-api-access-7ff4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.592267 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d2108dcb-bf35-433c-8fc3-49a4e63da0fe" (UID: "d2108dcb-bf35-433c-8fc3-49a4e63da0fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.603768 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d2108dcb-bf35-433c-8fc3-49a4e63da0fe" (UID: "d2108dcb-bf35-433c-8fc3-49a4e63da0fe"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.617839 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ff4c\" (UniqueName: \"kubernetes.io/projected/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-kube-api-access-7ff4c\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.617876 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.617890 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.623029 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-config" (OuterVolumeSpecName: "config") pod "d2108dcb-bf35-433c-8fc3-49a4e63da0fe" (UID: "d2108dcb-bf35-433c-8fc3-49a4e63da0fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.633838 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d2108dcb-bf35-433c-8fc3-49a4e63da0fe" (UID: "d2108dcb-bf35-433c-8fc3-49a4e63da0fe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.648758 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d2108dcb-bf35-433c-8fc3-49a4e63da0fe" (UID: "d2108dcb-bf35-433c-8fc3-49a4e63da0fe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.719335 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.719367 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:13:59 crc kubenswrapper[4923]: I0224 03:13:59.719377 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2108dcb-bf35-433c-8fc3-49a4e63da0fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.138639 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d7c54dbbb-xcg2j" Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.310051 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" event={"ID":"d2108dcb-bf35-433c-8fc3-49a4e63da0fe","Type":"ContainerDied","Data":"6b2e48f2b99f01a82bf1a45f4f756808c7a57e105cceab31161fc1282cc82292"} Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.310085 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.310100 4923 scope.go:117] "RemoveContainer" containerID="ed57864f95ca8fb86656872b4e3ade157b262bc5e879a44e51bf9610c281373c" Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.320441 4923 generic.go:334] "Generic (PLEG): container finished" podID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerID="7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d" exitCode=0 Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.321439 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b7cd41-26f8-4681-9b0a-adee15dfd6ec","Type":"ContainerDied","Data":"7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d"} Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.342985 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zxvbv"] Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.348467 4923 scope.go:117] "RemoveContainer" containerID="9be5e10d3fceb9fd20b5323aec018efe9769722e4f8fa9b42d6883c406138f66" Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.353563 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-zxvbv"] Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.354240 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d7c54dbbb-xcg2j" Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.419262 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b857749c4-v7hhc"] Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.419657 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b857749c4-v7hhc" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api" 
containerID="cri-o://5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f" gracePeriod=30 Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.419542 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b857749c4-v7hhc" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api-log" containerID="cri-o://48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a" gracePeriod=30 Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.454531 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7b857749c4-v7hhc" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF" Feb 24 03:14:00 crc kubenswrapper[4923]: I0224 03:14:00.455280 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7b857749c4-v7hhc" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.199630 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-557784489b-tj6xd" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.329479 4923 generic.go:334] "Generic (PLEG): container finished" podID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerID="48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a" exitCode=143 Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.329560 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b857749c4-v7hhc" event={"ID":"28a2368e-86ca-4e7e-a681-291c4b6a0225","Type":"ContainerDied","Data":"48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a"} Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.628929 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/placement-557784489b-tj6xd" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.726995 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" path="/var/lib/kubelet/pods/d2108dcb-bf35-433c-8fc3-49a4e63da0fe/volumes" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.847641 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9d7999766-h8pkz"] Feb 24 03:14:01 crc kubenswrapper[4923]: E0224 03:14:01.848316 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" containerName="init" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.848335 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" containerName="init" Feb 24 03:14:01 crc kubenswrapper[4923]: E0224 03:14:01.848351 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" containerName="dnsmasq-dns" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.848358 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" containerName="dnsmasq-dns" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.848520 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" containerName="dnsmasq-dns" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.849346 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.881054 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9d7999766-h8pkz"] Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.981861 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-config-data\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.981955 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-combined-ca-bundle\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.982013 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhvq4\" (UniqueName: \"kubernetes.io/projected/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-kube-api-access-dhvq4\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.982057 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-logs\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.982451 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-scripts\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.982574 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-public-tls-certs\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:01 crc kubenswrapper[4923]: I0224 03:14:01.982900 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-internal-tls-certs\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.085869 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-config-data\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.085937 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-combined-ca-bundle\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.086032 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhvq4\" (UniqueName: 
\"kubernetes.io/projected/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-kube-api-access-dhvq4\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.086065 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-logs\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.086143 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-scripts\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.086171 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-public-tls-certs\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.086221 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-internal-tls-certs\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.087081 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-logs\") pod \"placement-9d7999766-h8pkz\" (UID: 
\"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.092051 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-combined-ca-bundle\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.094854 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-internal-tls-certs\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.094866 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-config-data\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.096567 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-public-tls-certs\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.098630 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-scripts\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 
03:14:02.107901 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhvq4\" (UniqueName: \"kubernetes.io/projected/88a4bad2-fdbb-4186-b218-093ff0cf4b9c-kube-api-access-dhvq4\") pod \"placement-9d7999766-h8pkz\" (UID: \"88a4bad2-fdbb-4186-b218-093ff0cf4b9c\") " pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.165020 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.354723 4923 generic.go:334] "Generic (PLEG): container finished" podID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerID="cb7411d44f45defbd1929cfc1eae6d03e59fdb9ab8d24efc07231c5658ea4b54" exitCode=0 Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.354766 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d47cfd49-7hzln" event={"ID":"c2b74cd8-a2f1-4db6-b604-1add73452a54","Type":"ContainerDied","Data":"cb7411d44f45defbd1929cfc1eae6d03e59fdb9ab8d24efc07231c5658ea4b54"} Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.445040 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.598757 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-config\") pod \"c2b74cd8-a2f1-4db6-b604-1add73452a54\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.598852 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbv92\" (UniqueName: \"kubernetes.io/projected/c2b74cd8-a2f1-4db6-b604-1add73452a54-kube-api-access-pbv92\") pod \"c2b74cd8-a2f1-4db6-b604-1add73452a54\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.598910 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-public-tls-certs\") pod \"c2b74cd8-a2f1-4db6-b604-1add73452a54\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.598947 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-httpd-config\") pod \"c2b74cd8-a2f1-4db6-b604-1add73452a54\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.599045 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-ovndb-tls-certs\") pod \"c2b74cd8-a2f1-4db6-b604-1add73452a54\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.599091 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-combined-ca-bundle\") pod \"c2b74cd8-a2f1-4db6-b604-1add73452a54\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.599115 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-internal-tls-certs\") pod \"c2b74cd8-a2f1-4db6-b604-1add73452a54\" (UID: \"c2b74cd8-a2f1-4db6-b604-1add73452a54\") " Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.604993 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c2b74cd8-a2f1-4db6-b604-1add73452a54" (UID: "c2b74cd8-a2f1-4db6-b604-1add73452a54"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.611147 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b74cd8-a2f1-4db6-b604-1add73452a54-kube-api-access-pbv92" (OuterVolumeSpecName: "kube-api-access-pbv92") pod "c2b74cd8-a2f1-4db6-b604-1add73452a54" (UID: "c2b74cd8-a2f1-4db6-b604-1add73452a54"). InnerVolumeSpecName "kube-api-access-pbv92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.677817 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c2b74cd8-a2f1-4db6-b604-1add73452a54" (UID: "c2b74cd8-a2f1-4db6-b604-1add73452a54"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.681235 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-config" (OuterVolumeSpecName: "config") pod "c2b74cd8-a2f1-4db6-b604-1add73452a54" (UID: "c2b74cd8-a2f1-4db6-b604-1add73452a54"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.682942 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9d7999766-h8pkz"] Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.687743 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c2b74cd8-a2f1-4db6-b604-1add73452a54" (UID: "c2b74cd8-a2f1-4db6-b604-1add73452a54"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.696045 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c2b74cd8-a2f1-4db6-b604-1add73452a54" (UID: "c2b74cd8-a2f1-4db6-b604-1add73452a54"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.700457 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2b74cd8-a2f1-4db6-b604-1add73452a54" (UID: "c2b74cd8-a2f1-4db6-b604-1add73452a54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.709786 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.709819 4923 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.709832 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.709843 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbv92\" (UniqueName: \"kubernetes.io/projected/c2b74cd8-a2f1-4db6-b604-1add73452a54-kube-api-access-pbv92\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.709854 4923 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.709862 4923 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:02 crc kubenswrapper[4923]: I0224 03:14:02.709871 4923 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2b74cd8-a2f1-4db6-b604-1add73452a54-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.322973 4923 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.368543 4923 generic.go:334] "Generic (PLEG): container finished" podID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerID="eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112" exitCode=0 Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.368628 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8677dd-w8wrp" event={"ID":"260b26fd-552c-4dbb-b181-d423dbd57de2","Type":"ContainerDied","Data":"eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112"} Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.370275 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9d7999766-h8pkz" event={"ID":"88a4bad2-fdbb-4186-b218-093ff0cf4b9c","Type":"ContainerStarted","Data":"32b7fa5e47b73f51751a882e95b3c1db855cdc16175aa726ab06b01eb98b2c44"} Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.370326 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9d7999766-h8pkz" event={"ID":"88a4bad2-fdbb-4186-b218-093ff0cf4b9c","Type":"ContainerStarted","Data":"23b6a2f7f95b75a09fcbb82b1e4a6ead9ff5c64e3a5abbdcc1e696ab88774eea"} Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.370338 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9d7999766-h8pkz" event={"ID":"88a4bad2-fdbb-4186-b218-093ff0cf4b9c","Type":"ContainerStarted","Data":"befac31f7624f1633e1c1702d03f93c38bc6c94fceae8da9dfa60e80224a70f8"} Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.371408 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.371437 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:03 crc kubenswrapper[4923]: 
I0224 03:14:03.374713 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d47cfd49-7hzln" event={"ID":"c2b74cd8-a2f1-4db6-b604-1add73452a54","Type":"ContainerDied","Data":"2577b1188f7c0e3f34a30b1a81fe21830f3df1398416983be539a4ae6020abfb"} Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.374760 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86d47cfd49-7hzln" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.374764 4923 scope.go:117] "RemoveContainer" containerID="53b8b575df0335247e17f77ae72a9c76ce47979f5ebab2910de76cb5d115a980" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.386788 4923 generic.go:334] "Generic (PLEG): container finished" podID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerID="d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201" exitCode=0 Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.386838 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b7cd41-26f8-4681-9b0a-adee15dfd6ec","Type":"ContainerDied","Data":"d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201"} Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.386869 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b7cd41-26f8-4681-9b0a-adee15dfd6ec","Type":"ContainerDied","Data":"469ca1415ace91495986e8cea47f23b0492edc800984d927ee8289434fd11df3"} Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.386930 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.396070 4923 scope.go:117] "RemoveContainer" containerID="cb7411d44f45defbd1929cfc1eae6d03e59fdb9ab8d24efc07231c5658ea4b54" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.399332 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9d7999766-h8pkz" podStartSLOduration=2.399312089 podStartE2EDuration="2.399312089s" podCreationTimestamp="2026-02-24 03:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:14:03.387958753 +0000 UTC m=+1167.405029566" watchObservedRunningTime="2026-02-24 03:14:03.399312089 +0000 UTC m=+1167.416382902" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.419369 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86d47cfd49-7hzln"] Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.424168 4923 scope.go:117] "RemoveContainer" containerID="7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.431431 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-86d47cfd49-7hzln"] Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.442280 4923 scope.go:117] "RemoveContainer" containerID="d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.470806 4923 scope.go:117] "RemoveContainer" containerID="7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d" Feb 24 03:14:03 crc kubenswrapper[4923]: E0224 03:14:03.474333 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d\": container with ID starting with 
7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d not found: ID does not exist" containerID="7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.474385 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d"} err="failed to get container status \"7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d\": rpc error: code = NotFound desc = could not find container \"7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d\": container with ID starting with 7d683f297b3a6d01c0897f9207f85fdb2d04e893eca5c425990991411129266d not found: ID does not exist" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.474413 4923 scope.go:117] "RemoveContainer" containerID="d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201" Feb 24 03:14:03 crc kubenswrapper[4923]: E0224 03:14:03.474718 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201\": container with ID starting with d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201 not found: ID does not exist" containerID="d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.474771 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201"} err="failed to get container status \"d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201\": rpc error: code = NotFound desc = could not find container \"d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201\": container with ID starting with d43d99a3dfc4f6106f152b2d301cb1f9e7c1e9f6be1676ce1c85469e4069f201 not found: ID does not 
exist" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.523748 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data-custom\") pod \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.523897 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxfzn\" (UniqueName: \"kubernetes.io/projected/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-kube-api-access-kxfzn\") pod \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.525348 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data\") pod \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.525591 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-combined-ca-bundle\") pod \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.525705 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-scripts\") pod \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.525791 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-etc-machine-id\") pod \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\" (UID: \"78b7cd41-26f8-4681-9b0a-adee15dfd6ec\") " Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.525893 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "78b7cd41-26f8-4681-9b0a-adee15dfd6ec" (UID: "78b7cd41-26f8-4681-9b0a-adee15dfd6ec"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.527805 4923 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.529545 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-scripts" (OuterVolumeSpecName: "scripts") pod "78b7cd41-26f8-4681-9b0a-adee15dfd6ec" (UID: "78b7cd41-26f8-4681-9b0a-adee15dfd6ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.529605 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-kube-api-access-kxfzn" (OuterVolumeSpecName: "kube-api-access-kxfzn") pod "78b7cd41-26f8-4681-9b0a-adee15dfd6ec" (UID: "78b7cd41-26f8-4681-9b0a-adee15dfd6ec"). InnerVolumeSpecName "kube-api-access-kxfzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.530194 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "78b7cd41-26f8-4681-9b0a-adee15dfd6ec" (UID: "78b7cd41-26f8-4681-9b0a-adee15dfd6ec"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.584994 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78b7cd41-26f8-4681-9b0a-adee15dfd6ec" (UID: "78b7cd41-26f8-4681-9b0a-adee15dfd6ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.628430 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.628474 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.628484 4923 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.628493 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxfzn\" (UniqueName: \"kubernetes.io/projected/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-kube-api-access-kxfzn\") 
on node \"crc\" DevicePath \"\"" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.652076 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data" (OuterVolumeSpecName: "config-data") pod "78b7cd41-26f8-4681-9b0a-adee15dfd6ec" (UID: "78b7cd41-26f8-4681-9b0a-adee15dfd6ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.729166 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b7cd41-26f8-4681-9b0a-adee15dfd6ec-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.737679 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" path="/var/lib/kubelet/pods/c2b74cd8-a2f1-4db6-b604-1add73452a54/volumes" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.740258 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.743594 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.758981 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 24 03:14:03 crc kubenswrapper[4923]: E0224 03:14:03.759493 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerName="probe" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.759521 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerName="probe" Feb 24 03:14:03 crc kubenswrapper[4923]: E0224 03:14:03.759543 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" 
containerName="neutron-api" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.759553 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerName="neutron-api" Feb 24 03:14:03 crc kubenswrapper[4923]: E0224 03:14:03.759591 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerName="neutron-httpd" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.759601 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerName="neutron-httpd" Feb 24 03:14:03 crc kubenswrapper[4923]: E0224 03:14:03.759621 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerName="cinder-scheduler" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.759643 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerName="cinder-scheduler" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.759848 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerName="neutron-httpd" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.759879 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerName="cinder-scheduler" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.759900 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b74cd8-a2f1-4db6-b604-1add73452a54" containerName="neutron-api" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.759912 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" containerName="probe" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.760936 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.763411 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.773014 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fb8677dd-w8wrp" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.777872 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.830732 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnxs5\" (UniqueName: \"kubernetes.io/projected/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-kube-api-access-jnxs5\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.830777 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.830835 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc 
kubenswrapper[4923]: I0224 03:14:03.830965 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.831011 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.831106 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.933227 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.933282 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.933317 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.933343 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.933433 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnxs5\" (UniqueName: \"kubernetes.io/projected/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-kube-api-access-jnxs5\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.933434 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.933458 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.938235 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.938454 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.938578 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.939008 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:03 crc kubenswrapper[4923]: I0224 03:14:03.952745 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnxs5\" (UniqueName: \"kubernetes.io/projected/6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf-kube-api-access-jnxs5\") pod \"cinder-scheduler-0\" (UID: \"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf\") " pod="openstack/cinder-scheduler-0" Feb 24 03:14:04 crc kubenswrapper[4923]: I0224 03:14:04.087786 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 24 03:14:04 crc kubenswrapper[4923]: I0224 03:14:04.106152 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-zxvbv" podUID="d2108dcb-bf35-433c-8fc3-49a4e63da0fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: i/o timeout" Feb 24 03:14:04 crc kubenswrapper[4923]: W0224 03:14:04.552232 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ba44e97_ed2b_4e52_8f38_3279a2fdb3bf.slice/crio-345a0a6757028b3abd1a67fd53e8a83eed29bbd5b9594215ffb0cb5e9617f61d WatchSource:0}: Error finding container 345a0a6757028b3abd1a67fd53e8a83eed29bbd5b9594215ffb0cb5e9617f61d: Status 404 returned error can't find the container with id 345a0a6757028b3abd1a67fd53e8a83eed29bbd5b9594215ffb0cb5e9617f61d Feb 24 03:14:04 crc kubenswrapper[4923]: I0224 03:14:04.574365 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 24 03:14:04 crc kubenswrapper[4923]: I0224 03:14:04.880118 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b857749c4-v7hhc" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:57340->10.217.0.163:9311: read: connection reset by peer" Feb 24 03:14:04 crc kubenswrapper[4923]: I0224 03:14:04.880465 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b857749c4-v7hhc" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:57324->10.217.0.163:9311: read: connection reset by peer" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.305410 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.420413 4923 generic.go:334] "Generic (PLEG): container finished" podID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerID="5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f" exitCode=0 Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.420491 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b857749c4-v7hhc" event={"ID":"28a2368e-86ca-4e7e-a681-291c4b6a0225","Type":"ContainerDied","Data":"5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f"} Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.420819 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b857749c4-v7hhc" event={"ID":"28a2368e-86ca-4e7e-a681-291c4b6a0225","Type":"ContainerDied","Data":"11ee44d28f1b11458b1719deebf032c73fdac8aae529f903a0bc6d80634f699a"} Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.420516 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b857749c4-v7hhc" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.420858 4923 scope.go:117] "RemoveContainer" containerID="5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.433391 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf","Type":"ContainerStarted","Data":"881954380cd2ee46f40de2d2642579b1e3b19ef6c22e5e901bd8e718679c5027"} Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.433425 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf","Type":"ContainerStarted","Data":"345a0a6757028b3abd1a67fd53e8a83eed29bbd5b9594215ffb0cb5e9617f61d"} Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.446212 4923 scope.go:117] "RemoveContainer" containerID="48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.459835 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a2368e-86ca-4e7e-a681-291c4b6a0225-logs\") pod \"28a2368e-86ca-4e7e-a681-291c4b6a0225\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.460372 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfjvf\" (UniqueName: \"kubernetes.io/projected/28a2368e-86ca-4e7e-a681-291c4b6a0225-kube-api-access-pfjvf\") pod \"28a2368e-86ca-4e7e-a681-291c4b6a0225\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.460415 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data\") pod 
\"28a2368e-86ca-4e7e-a681-291c4b6a0225\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.460457 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data-custom\") pod \"28a2368e-86ca-4e7e-a681-291c4b6a0225\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.460744 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a2368e-86ca-4e7e-a681-291c4b6a0225-logs" (OuterVolumeSpecName: "logs") pod "28a2368e-86ca-4e7e-a681-291c4b6a0225" (UID: "28a2368e-86ca-4e7e-a681-291c4b6a0225"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.460953 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-combined-ca-bundle\") pod \"28a2368e-86ca-4e7e-a681-291c4b6a0225\" (UID: \"28a2368e-86ca-4e7e-a681-291c4b6a0225\") " Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.461556 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28a2368e-86ca-4e7e-a681-291c4b6a0225-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.467434 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28a2368e-86ca-4e7e-a681-291c4b6a0225" (UID: "28a2368e-86ca-4e7e-a681-291c4b6a0225"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.467495 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a2368e-86ca-4e7e-a681-291c4b6a0225-kube-api-access-pfjvf" (OuterVolumeSpecName: "kube-api-access-pfjvf") pod "28a2368e-86ca-4e7e-a681-291c4b6a0225" (UID: "28a2368e-86ca-4e7e-a681-291c4b6a0225"). InnerVolumeSpecName "kube-api-access-pfjvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.473377 4923 scope.go:117] "RemoveContainer" containerID="5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f" Feb 24 03:14:05 crc kubenswrapper[4923]: E0224 03:14:05.474044 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f\": container with ID starting with 5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f not found: ID does not exist" containerID="5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.474090 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f"} err="failed to get container status \"5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f\": rpc error: code = NotFound desc = could not find container \"5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f\": container with ID starting with 5b61863cf13c708b62925c638604de330b50db6102297c10d82e9fdfe777778f not found: ID does not exist" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.474118 4923 scope.go:117] "RemoveContainer" containerID="48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a" Feb 24 03:14:05 crc kubenswrapper[4923]: E0224 03:14:05.474557 4923 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a\": container with ID starting with 48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a not found: ID does not exist" containerID="48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.474584 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a"} err="failed to get container status \"48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a\": rpc error: code = NotFound desc = could not find container \"48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a\": container with ID starting with 48d21d070e24f11f9d5e9a688b4682e271548878d75e0eca9de462b73cf2a45a not found: ID does not exist" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.496128 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28a2368e-86ca-4e7e-a681-291c4b6a0225" (UID: "28a2368e-86ca-4e7e-a681-291c4b6a0225"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.525966 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data" (OuterVolumeSpecName: "config-data") pod "28a2368e-86ca-4e7e-a681-291c4b6a0225" (UID: "28a2368e-86ca-4e7e-a681-291c4b6a0225"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.563123 4923 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.563395 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.563426 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfjvf\" (UniqueName: \"kubernetes.io/projected/28a2368e-86ca-4e7e-a681-291c4b6a0225-kube-api-access-pfjvf\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.563440 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2368e-86ca-4e7e-a681-291c4b6a0225-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.740669 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b7cd41-26f8-4681-9b0a-adee15dfd6ec" path="/var/lib/kubelet/pods/78b7cd41-26f8-4681-9b0a-adee15dfd6ec/volumes" Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.775996 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b857749c4-v7hhc"] Feb 24 03:14:05 crc kubenswrapper[4923]: I0224 03:14:05.791287 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7b857749c4-v7hhc"] Feb 24 03:14:06 crc kubenswrapper[4923]: I0224 03:14:06.448736 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf","Type":"ContainerStarted","Data":"bbf83c0b4ab21d1f12a092f2a0288f9e74c91d30bba4695045ffc2abec1c9c46"} Feb 24 03:14:06 crc kubenswrapper[4923]: I0224 03:14:06.477092 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.477070774 podStartE2EDuration="3.477070774s" podCreationTimestamp="2026-02-24 03:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:14:06.469605469 +0000 UTC m=+1170.486676292" watchObservedRunningTime="2026-02-24 03:14:06.477070774 +0000 UTC m=+1170.494141587" Feb 24 03:14:07 crc kubenswrapper[4923]: I0224 03:14:07.590773 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 24 03:14:07 crc kubenswrapper[4923]: I0224 03:14:07.723054 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" path="/var/lib/kubelet/pods/28a2368e-86ca-4e7e-a681-291c4b6a0225/volumes" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.089004 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.400610 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6cfd87c4f7-w99br" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.835092 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 24 03:14:09 crc kubenswrapper[4923]: E0224 03:14:09.835549 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.835571 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" 
containerName="barbican-api" Feb 24 03:14:09 crc kubenswrapper[4923]: E0224 03:14:09.835602 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api-log" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.835611 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api-log" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.835853 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api-log" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.835894 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a2368e-86ca-4e7e-a681-291c4b6a0225" containerName="barbican-api" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.836719 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.838628 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-cljxc" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.839006 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.843051 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.852881 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.869407 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.869471 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcbk9\" (UniqueName: \"kubernetes.io/projected/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-kube-api-access-dcbk9\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.869590 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.869687 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-openstack-config\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.971524 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.971615 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-openstack-config\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc 
kubenswrapper[4923]: I0224 03:14:09.971738 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-openstack-config-secret\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.971797 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcbk9\" (UniqueName: \"kubernetes.io/projected/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-kube-api-access-dcbk9\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.972539 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-openstack-config\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.977616 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.978742 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-openstack-config-secret\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:09 crc kubenswrapper[4923]: I0224 03:14:09.995444 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcbk9\" (UniqueName: 
\"kubernetes.io/projected/6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3-kube-api-access-dcbk9\") pod \"openstackclient\" (UID: \"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3\") " pod="openstack/openstackclient" Feb 24 03:14:10 crc kubenswrapper[4923]: I0224 03:14:10.155422 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 24 03:14:10 crc kubenswrapper[4923]: I0224 03:14:10.638740 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 24 03:14:11 crc kubenswrapper[4923]: I0224 03:14:11.494238 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3","Type":"ContainerStarted","Data":"7301e940650ac5ce3cbca903efd88dc79825d734cef12ec99636a6d5bea246e7"} Feb 24 03:14:13 crc kubenswrapper[4923]: I0224 03:14:13.767815 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fb8677dd-w8wrp" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 24 03:14:14 crc kubenswrapper[4923]: I0224 03:14:14.314339 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.038089 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-645cdc8bdf-bkt49"] Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.041897 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.055425 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-645cdc8bdf-bkt49"] Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.057520 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.057585 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.057288 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.195232 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2632f-7155-4c9e-9767-fcda3ff0688b-run-httpd\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.195599 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-internal-tls-certs\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.195670 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-combined-ca-bundle\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc 
kubenswrapper[4923]: I0224 03:14:15.195714 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28a2632f-7155-4c9e-9767-fcda3ff0688b-etc-swift\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.195739 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk46x\" (UniqueName: \"kubernetes.io/projected/28a2632f-7155-4c9e-9767-fcda3ff0688b-kube-api-access-pk46x\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.195764 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-public-tls-certs\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.195914 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-config-data\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.195986 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2632f-7155-4c9e-9767-fcda3ff0688b-log-httpd\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " 
pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.297254 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-combined-ca-bundle\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.297364 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28a2632f-7155-4c9e-9767-fcda3ff0688b-etc-swift\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.297393 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk46x\" (UniqueName: \"kubernetes.io/projected/28a2632f-7155-4c9e-9767-fcda3ff0688b-kube-api-access-pk46x\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.297427 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-public-tls-certs\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.297468 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-config-data\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 
03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.297494 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2632f-7155-4c9e-9767-fcda3ff0688b-log-httpd\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.297539 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2632f-7155-4c9e-9767-fcda3ff0688b-run-httpd\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.297569 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-internal-tls-certs\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.298918 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2632f-7155-4c9e-9767-fcda3ff0688b-log-httpd\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.300240 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28a2632f-7155-4c9e-9767-fcda3ff0688b-run-httpd\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.305977 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-combined-ca-bundle\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.307466 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-internal-tls-certs\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.309560 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28a2632f-7155-4c9e-9767-fcda3ff0688b-etc-swift\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.311053 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-config-data\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.314008 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28a2632f-7155-4c9e-9767-fcda3ff0688b-public-tls-certs\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.321048 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk46x\" (UniqueName: 
\"kubernetes.io/projected/28a2632f-7155-4c9e-9767-fcda3ff0688b-kube-api-access-pk46x\") pod \"swift-proxy-645cdc8bdf-bkt49\" (UID: \"28a2632f-7155-4c9e-9767-fcda3ff0688b\") " pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:15 crc kubenswrapper[4923]: I0224 03:14:15.388258 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.656220 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.656484 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="ceilometer-central-agent" containerID="cri-o://5919387f8f34160d6612365685b906a41eb578b59a086e4c64db02f0f442727a" gracePeriod=30 Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.656540 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="sg-core" containerID="cri-o://1f1bc218cb4d79edfad54a297ed32f7856e72c8f62400a264696b02673014592" gracePeriod=30 Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.656609 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="ceilometer-notification-agent" containerID="cri-o://c1dfd6f41883eed123c0ecd39d79fe39ad649e65bcf3aa63c4da62a4daf18679" gracePeriod=30 Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.656622 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="proxy-httpd" containerID="cri-o://8298593c5109efd36f80cd6f37e412910893e60ae562106e275e501ab8ef2283" gracePeriod=30 Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 
03:14:16.663198 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.849821 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2xw9b"] Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.850994 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.863531 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2xw9b"] Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.938577 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-c7ns9"] Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.939935 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.948345 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-c7ns9"] Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.979092 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0144-account-create-update-j6dm5"] Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.980320 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.982124 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 24 03:14:16 crc kubenswrapper[4923]: I0224 03:14:16.987166 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0144-account-create-update-j6dm5"] Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.029482 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws9ql\" (UniqueName: \"kubernetes.io/projected/395dac89-2c47-446a-8254-0ba691868651-kube-api-access-ws9ql\") pod \"nova-api-db-create-2xw9b\" (UID: \"395dac89-2c47-446a-8254-0ba691868651\") " pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.029591 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395dac89-2c47-446a-8254-0ba691868651-operator-scripts\") pod \"nova-api-db-create-2xw9b\" (UID: \"395dac89-2c47-446a-8254-0ba691868651\") " pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.132006 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwxgs\" (UniqueName: \"kubernetes.io/projected/d86572db-86bc-4cd2-b551-a66bf5c47c7a-kube-api-access-fwxgs\") pod \"nova-cell0-db-create-c7ns9\" (UID: \"d86572db-86bc-4cd2-b551-a66bf5c47c7a\") " pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.132061 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d86572db-86bc-4cd2-b551-a66bf5c47c7a-operator-scripts\") pod \"nova-cell0-db-create-c7ns9\" (UID: 
\"d86572db-86bc-4cd2-b551-a66bf5c47c7a\") " pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.132103 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-operator-scripts\") pod \"nova-api-0144-account-create-update-j6dm5\" (UID: \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\") " pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.132166 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws9ql\" (UniqueName: \"kubernetes.io/projected/395dac89-2c47-446a-8254-0ba691868651-kube-api-access-ws9ql\") pod \"nova-api-db-create-2xw9b\" (UID: \"395dac89-2c47-446a-8254-0ba691868651\") " pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.132197 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq7mk\" (UniqueName: \"kubernetes.io/projected/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-kube-api-access-xq7mk\") pod \"nova-api-0144-account-create-update-j6dm5\" (UID: \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\") " pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.132251 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395dac89-2c47-446a-8254-0ba691868651-operator-scripts\") pod \"nova-api-db-create-2xw9b\" (UID: \"395dac89-2c47-446a-8254-0ba691868651\") " pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.133550 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/395dac89-2c47-446a-8254-0ba691868651-operator-scripts\") pod \"nova-api-db-create-2xw9b\" (UID: \"395dac89-2c47-446a-8254-0ba691868651\") " pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.163680 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws9ql\" (UniqueName: \"kubernetes.io/projected/395dac89-2c47-446a-8254-0ba691868651-kube-api-access-ws9ql\") pod \"nova-api-db-create-2xw9b\" (UID: \"395dac89-2c47-446a-8254-0ba691868651\") " pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.163755 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fde5-account-create-update-bkgr4"] Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.165249 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.172960 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-lplvk"] Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.174401 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.177183 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.184714 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.188106 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fde5-account-create-update-bkgr4"] Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.217201 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lplvk"] Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.237670 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq7mk\" (UniqueName: \"kubernetes.io/projected/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-kube-api-access-xq7mk\") pod \"nova-api-0144-account-create-update-j6dm5\" (UID: \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\") " pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.237956 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5df5\" (UniqueName: \"kubernetes.io/projected/f491572b-dcd8-40f5-96e0-d6393b852858-kube-api-access-x5df5\") pod \"nova-cell1-db-create-lplvk\" (UID: \"f491572b-dcd8-40f5-96e0-d6393b852858\") " pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.237988 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2465587c-f0e4-4755-acd6-016d2b1a4cbf-operator-scripts\") pod \"nova-cell0-fde5-account-create-update-bkgr4\" (UID: \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\") " pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.238010 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzlh7\" (UniqueName: \"kubernetes.io/projected/2465587c-f0e4-4755-acd6-016d2b1a4cbf-kube-api-access-zzlh7\") pod 
\"nova-cell0-fde5-account-create-update-bkgr4\" (UID: \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\") " pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.238252 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxgs\" (UniqueName: \"kubernetes.io/projected/d86572db-86bc-4cd2-b551-a66bf5c47c7a-kube-api-access-fwxgs\") pod \"nova-cell0-db-create-c7ns9\" (UID: \"d86572db-86bc-4cd2-b551-a66bf5c47c7a\") " pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.238345 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d86572db-86bc-4cd2-b551-a66bf5c47c7a-operator-scripts\") pod \"nova-cell0-db-create-c7ns9\" (UID: \"d86572db-86bc-4cd2-b551-a66bf5c47c7a\") " pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.238428 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-operator-scripts\") pod \"nova-api-0144-account-create-update-j6dm5\" (UID: \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\") " pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.238504 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f491572b-dcd8-40f5-96e0-d6393b852858-operator-scripts\") pod \"nova-cell1-db-create-lplvk\" (UID: \"f491572b-dcd8-40f5-96e0-d6393b852858\") " pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.239191 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-operator-scripts\") pod \"nova-api-0144-account-create-update-j6dm5\" (UID: \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\") " pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.239218 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d86572db-86bc-4cd2-b551-a66bf5c47c7a-operator-scripts\") pod \"nova-cell0-db-create-c7ns9\" (UID: \"d86572db-86bc-4cd2-b551-a66bf5c47c7a\") " pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.252917 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq7mk\" (UniqueName: \"kubernetes.io/projected/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-kube-api-access-xq7mk\") pod \"nova-api-0144-account-create-update-j6dm5\" (UID: \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\") " pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.255056 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxgs\" (UniqueName: \"kubernetes.io/projected/d86572db-86bc-4cd2-b551-a66bf5c47c7a-kube-api-access-fwxgs\") pod \"nova-cell0-db-create-c7ns9\" (UID: \"d86572db-86bc-4cd2-b551-a66bf5c47c7a\") " pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.257143 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.298623 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.339911 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5df5\" (UniqueName: \"kubernetes.io/projected/f491572b-dcd8-40f5-96e0-d6393b852858-kube-api-access-x5df5\") pod \"nova-cell1-db-create-lplvk\" (UID: \"f491572b-dcd8-40f5-96e0-d6393b852858\") " pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.339966 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2465587c-f0e4-4755-acd6-016d2b1a4cbf-operator-scripts\") pod \"nova-cell0-fde5-account-create-update-bkgr4\" (UID: \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\") " pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.339996 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzlh7\" (UniqueName: \"kubernetes.io/projected/2465587c-f0e4-4755-acd6-016d2b1a4cbf-kube-api-access-zzlh7\") pod \"nova-cell0-fde5-account-create-update-bkgr4\" (UID: \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\") " pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.340111 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f491572b-dcd8-40f5-96e0-d6393b852858-operator-scripts\") pod \"nova-cell1-db-create-lplvk\" (UID: \"f491572b-dcd8-40f5-96e0-d6393b852858\") " pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.341053 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f491572b-dcd8-40f5-96e0-d6393b852858-operator-scripts\") pod 
\"nova-cell1-db-create-lplvk\" (UID: \"f491572b-dcd8-40f5-96e0-d6393b852858\") " pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.341247 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2465587c-f0e4-4755-acd6-016d2b1a4cbf-operator-scripts\") pod \"nova-cell0-fde5-account-create-update-bkgr4\" (UID: \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\") " pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.353673 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-03dc-account-create-update-blt67"] Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.359535 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.361368 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.362912 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5df5\" (UniqueName: \"kubernetes.io/projected/f491572b-dcd8-40f5-96e0-d6393b852858-kube-api-access-x5df5\") pod \"nova-cell1-db-create-lplvk\" (UID: \"f491572b-dcd8-40f5-96e0-d6393b852858\") " pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.366575 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzlh7\" (UniqueName: \"kubernetes.io/projected/2465587c-f0e4-4755-acd6-016d2b1a4cbf-kube-api-access-zzlh7\") pod \"nova-cell0-fde5-account-create-update-bkgr4\" (UID: \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\") " pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.379537 4923 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell1-03dc-account-create-update-blt67"] Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.412009 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.438885 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.441638 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92b92176-765c-478f-a4df-26d52a903476-operator-scripts\") pod \"nova-cell1-03dc-account-create-update-blt67\" (UID: \"92b92176-765c-478f-a4df-26d52a903476\") " pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.441728 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjhmx\" (UniqueName: \"kubernetes.io/projected/92b92176-765c-478f-a4df-26d52a903476-kube-api-access-wjhmx\") pod \"nova-cell1-03dc-account-create-update-blt67\" (UID: \"92b92176-765c-478f-a4df-26d52a903476\") " pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.543105 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92b92176-765c-478f-a4df-26d52a903476-operator-scripts\") pod \"nova-cell1-03dc-account-create-update-blt67\" (UID: \"92b92176-765c-478f-a4df-26d52a903476\") " pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.543177 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjhmx\" (UniqueName: 
\"kubernetes.io/projected/92b92176-765c-478f-a4df-26d52a903476-kube-api-access-wjhmx\") pod \"nova-cell1-03dc-account-create-update-blt67\" (UID: \"92b92176-765c-478f-a4df-26d52a903476\") " pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.544576 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92b92176-765c-478f-a4df-26d52a903476-operator-scripts\") pod \"nova-cell1-03dc-account-create-update-blt67\" (UID: \"92b92176-765c-478f-a4df-26d52a903476\") " pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.560076 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjhmx\" (UniqueName: \"kubernetes.io/projected/92b92176-765c-478f-a4df-26d52a903476-kube-api-access-wjhmx\") pod \"nova-cell1-03dc-account-create-update-blt67\" (UID: \"92b92176-765c-478f-a4df-26d52a903476\") " pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.565515 4923 generic.go:334] "Generic (PLEG): container finished" podID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerID="8298593c5109efd36f80cd6f37e412910893e60ae562106e275e501ab8ef2283" exitCode=0 Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.565547 4923 generic.go:334] "Generic (PLEG): container finished" podID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerID="1f1bc218cb4d79edfad54a297ed32f7856e72c8f62400a264696b02673014592" exitCode=2 Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.565554 4923 generic.go:334] "Generic (PLEG): container finished" podID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerID="5919387f8f34160d6612365685b906a41eb578b59a086e4c64db02f0f442727a" exitCode=0 Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.565575 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerDied","Data":"8298593c5109efd36f80cd6f37e412910893e60ae562106e275e501ab8ef2283"} Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.565599 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerDied","Data":"1f1bc218cb4d79edfad54a297ed32f7856e72c8f62400a264696b02673014592"} Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.565609 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerDied","Data":"5919387f8f34160d6612365685b906a41eb578b59a086e4c64db02f0f442727a"} Feb 24 03:14:17 crc kubenswrapper[4923]: I0224 03:14:17.746753 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:18 crc kubenswrapper[4923]: I0224 03:14:18.575090 4923 generic.go:334] "Generic (PLEG): container finished" podID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerID="c1dfd6f41883eed123c0ecd39d79fe39ad649e65bcf3aa63c4da62a4daf18679" exitCode=0 Feb 24 03:14:18 crc kubenswrapper[4923]: I0224 03:14:18.575136 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerDied","Data":"c1dfd6f41883eed123c0ecd39d79fe39ad649e65bcf3aa63c4da62a4daf18679"} Feb 24 03:14:19 crc kubenswrapper[4923]: I0224 03:14:19.831215 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:19 crc kubenswrapper[4923]: I0224 03:14:19.918290 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-run-httpd\") pod \"e4afbe98-b630-4013-83b8-778ddbeb8b27\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " Feb 24 03:14:19 crc kubenswrapper[4923]: I0224 03:14:19.918530 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-combined-ca-bundle\") pod \"e4afbe98-b630-4013-83b8-778ddbeb8b27\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " Feb 24 03:14:19 crc kubenswrapper[4923]: I0224 03:14:19.918657 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-sg-core-conf-yaml\") pod \"e4afbe98-b630-4013-83b8-778ddbeb8b27\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " Feb 24 03:14:19 crc kubenswrapper[4923]: I0224 03:14:19.918765 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hpbx\" (UniqueName: \"kubernetes.io/projected/e4afbe98-b630-4013-83b8-778ddbeb8b27-kube-api-access-8hpbx\") pod \"e4afbe98-b630-4013-83b8-778ddbeb8b27\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " Feb 24 03:14:19 crc kubenswrapper[4923]: I0224 03:14:19.918876 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-config-data\") pod \"e4afbe98-b630-4013-83b8-778ddbeb8b27\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " Feb 24 03:14:19 crc kubenswrapper[4923]: I0224 03:14:19.918691 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4afbe98-b630-4013-83b8-778ddbeb8b27" (UID: "e4afbe98-b630-4013-83b8-778ddbeb8b27"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:19 crc kubenswrapper[4923]: I0224 03:14:19.928387 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4afbe98-b630-4013-83b8-778ddbeb8b27-kube-api-access-8hpbx" (OuterVolumeSpecName: "kube-api-access-8hpbx") pod "e4afbe98-b630-4013-83b8-778ddbeb8b27" (UID: "e4afbe98-b630-4013-83b8-778ddbeb8b27"). InnerVolumeSpecName "kube-api-access-8hpbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:19 crc kubenswrapper[4923]: I0224 03:14:19.951521 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4afbe98-b630-4013-83b8-778ddbeb8b27" (UID: "e4afbe98-b630-4013-83b8-778ddbeb8b27"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.003395 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4afbe98-b630-4013-83b8-778ddbeb8b27" (UID: "e4afbe98-b630-4013-83b8-778ddbeb8b27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.019583 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-config-data" (OuterVolumeSpecName: "config-data") pod "e4afbe98-b630-4013-83b8-778ddbeb8b27" (UID: "e4afbe98-b630-4013-83b8-778ddbeb8b27"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.020273 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-config-data\") pod \"e4afbe98-b630-4013-83b8-778ddbeb8b27\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " Feb 24 03:14:20 crc kubenswrapper[4923]: W0224 03:14:20.020345 4923 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e4afbe98-b630-4013-83b8-778ddbeb8b27/volumes/kubernetes.io~secret/config-data Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.020497 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-config-data" (OuterVolumeSpecName: "config-data") pod "e4afbe98-b630-4013-83b8-778ddbeb8b27" (UID: "e4afbe98-b630-4013-83b8-778ddbeb8b27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.020483 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-scripts\") pod \"e4afbe98-b630-4013-83b8-778ddbeb8b27\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.020602 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-log-httpd\") pod \"e4afbe98-b630-4013-83b8-778ddbeb8b27\" (UID: \"e4afbe98-b630-4013-83b8-778ddbeb8b27\") " Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.021087 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4afbe98-b630-4013-83b8-778ddbeb8b27" (UID: "e4afbe98-b630-4013-83b8-778ddbeb8b27"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.021407 4923 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.021427 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hpbx\" (UniqueName: \"kubernetes.io/projected/e4afbe98-b630-4013-83b8-778ddbeb8b27-kube-api-access-8hpbx\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.021440 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.021448 4923 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.021458 4923 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4afbe98-b630-4013-83b8-778ddbeb8b27-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.021468 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.023078 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-scripts" (OuterVolumeSpecName: "scripts") pod "e4afbe98-b630-4013-83b8-778ddbeb8b27" (UID: "e4afbe98-b630-4013-83b8-778ddbeb8b27"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.127681 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4afbe98-b630-4013-83b8-778ddbeb8b27-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.185478 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2xw9b"] Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.196835 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-c7ns9"] Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.208597 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0144-account-create-update-j6dm5"] Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.227598 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lplvk"] Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.300747 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-645cdc8bdf-bkt49"] Feb 24 03:14:20 crc kubenswrapper[4923]: W0224 03:14:20.309510 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28a2632f_7155_4c9e_9767_fcda3ff0688b.slice/crio-83b34e6cb3ae0f261d4a02d1317f8634073946cb5560e36a946d0cd1970887ec WatchSource:0}: Error finding container 83b34e6cb3ae0f261d4a02d1317f8634073946cb5560e36a946d0cd1970887ec: Status 404 returned error can't find the container with id 83b34e6cb3ae0f261d4a02d1317f8634073946cb5560e36a946d0cd1970887ec Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.399488 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-03dc-account-create-update-blt67"] Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.417318 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-fde5-account-create-update-bkgr4"] Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.597539 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" event={"ID":"2465587c-f0e4-4755-acd6-016d2b1a4cbf","Type":"ContainerStarted","Data":"2a42eba36ea43ec89bf56ce463e6836c8e152421b8fd137038a81349065d7b18"} Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.598766 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-645cdc8bdf-bkt49" event={"ID":"28a2632f-7155-4c9e-9767-fcda3ff0688b","Type":"ContainerStarted","Data":"83b34e6cb3ae0f261d4a02d1317f8634073946cb5560e36a946d0cd1970887ec"} Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.599882 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c7ns9" event={"ID":"d86572db-86bc-4cd2-b551-a66bf5c47c7a","Type":"ContainerStarted","Data":"e7df7d2cd61b8870dfd86c46c3600990ba59cf33a03e673835690d4f379d3a75"} Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.600887 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0144-account-create-update-j6dm5" event={"ID":"80844021-cbc2-4e7f-bbd8-1bac3ae22d98","Type":"ContainerStarted","Data":"f7e99bce2e497720c2d59788b7a4fa8da77fcbfa7cd23ab155178199cd82edfd"} Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.601716 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2xw9b" event={"ID":"395dac89-2c47-446a-8254-0ba691868651","Type":"ContainerStarted","Data":"e534eef905a2fba5dbbf3e673711c952655aef2c68fe370a7e18acfe76a05b2c"} Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.604459 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4afbe98-b630-4013-83b8-778ddbeb8b27","Type":"ContainerDied","Data":"55be6b58f77865c8b5e518a89747df4badd0e6f61fc1c797a33338e93a66f35a"} Feb 24 03:14:20 crc kubenswrapper[4923]: 
I0224 03:14:20.604496 4923 scope.go:117] "RemoveContainer" containerID="8298593c5109efd36f80cd6f37e412910893e60ae562106e275e501ab8ef2283" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.604636 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.606620 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lplvk" event={"ID":"f491572b-dcd8-40f5-96e0-d6393b852858","Type":"ContainerStarted","Data":"1fc759cd59f71815bea1a5c97e442d9cdcd49609d4a5d3e23c79464d0c03eba9"} Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.610833 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-03dc-account-create-update-blt67" event={"ID":"92b92176-765c-478f-a4df-26d52a903476","Type":"ContainerStarted","Data":"739ca21c2c1925fe3cc56deb2f69af9ee2e421ba3345f4779673039c138dee51"} Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.614478 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3","Type":"ContainerStarted","Data":"11d028608ac99b97e0d101e8d6a2d30c58c341b82a13c65eb140f2fe0e68220b"} Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.635046 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.772230435 podStartE2EDuration="11.635029189s" podCreationTimestamp="2026-02-24 03:14:09 +0000 UTC" firstStartedPulling="2026-02-24 03:14:10.646883657 +0000 UTC m=+1174.663954470" lastFinishedPulling="2026-02-24 03:14:19.509682401 +0000 UTC m=+1183.526753224" observedRunningTime="2026-02-24 03:14:20.628405746 +0000 UTC m=+1184.645476559" watchObservedRunningTime="2026-02-24 03:14:20.635029189 +0000 UTC m=+1184.652100002" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.636138 4923 scope.go:117] "RemoveContainer" 
containerID="1f1bc218cb4d79edfad54a297ed32f7856e72c8f62400a264696b02673014592" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.664928 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.671019 4923 scope.go:117] "RemoveContainer" containerID="c1dfd6f41883eed123c0ecd39d79fe39ad649e65bcf3aa63c4da62a4daf18679" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.674982 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.688751 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:20 crc kubenswrapper[4923]: E0224 03:14:20.689112 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="sg-core" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.689129 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="sg-core" Feb 24 03:14:20 crc kubenswrapper[4923]: E0224 03:14:20.689147 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="ceilometer-notification-agent" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.689154 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="ceilometer-notification-agent" Feb 24 03:14:20 crc kubenswrapper[4923]: E0224 03:14:20.689179 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="proxy-httpd" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.689185 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="proxy-httpd" Feb 24 03:14:20 crc kubenswrapper[4923]: E0224 03:14:20.689196 4923 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="ceilometer-central-agent" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.689202 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="ceilometer-central-agent" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.689371 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="sg-core" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.689389 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="ceilometer-central-agent" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.689405 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="ceilometer-notification-agent" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.689416 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" containerName="proxy-httpd" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.690954 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.694074 4923 scope.go:117] "RemoveContainer" containerID="5919387f8f34160d6612365685b906a41eb578b59a086e4c64db02f0f442727a" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.696270 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.696570 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.700182 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.838455 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-log-httpd\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.838753 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db492\" (UniqueName: \"kubernetes.io/projected/efcc7805-95e8-4625-b5f9-a1875fbba24d-kube-api-access-db492\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.838783 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.838807 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.838827 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-run-httpd\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.838930 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-config-data\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.838961 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-scripts\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.940619 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db492\" (UniqueName: \"kubernetes.io/projected/efcc7805-95e8-4625-b5f9-a1875fbba24d-kube-api-access-db492\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.940679 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.940710 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.940747 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-run-httpd\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.940807 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-config-data\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.940828 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-scripts\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.940887 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-log-httpd\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.941758 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-run-httpd\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.941829 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-log-httpd\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.948168 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-config-data\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.953219 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.953831 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.957606 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-scripts\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:20 crc kubenswrapper[4923]: I0224 03:14:20.974061 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-db492\" (UniqueName: \"kubernetes.io/projected/efcc7805-95e8-4625-b5f9-a1875fbba24d-kube-api-access-db492\") pod \"ceilometer-0\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " pod="openstack/ceilometer-0" Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.011848 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.508203 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:21 crc kubenswrapper[4923]: W0224 03:14:21.531001 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefcc7805_95e8_4625_b5f9_a1875fbba24d.slice/crio-ed4178af7308a9b36f45629f59fd906f72eabe7104b6178bff1fa9bd22a29016 WatchSource:0}: Error finding container ed4178af7308a9b36f45629f59fd906f72eabe7104b6178bff1fa9bd22a29016: Status 404 returned error can't find the container with id ed4178af7308a9b36f45629f59fd906f72eabe7104b6178bff1fa9bd22a29016 Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.623733 4923 generic.go:334] "Generic (PLEG): container finished" podID="d86572db-86bc-4cd2-b551-a66bf5c47c7a" containerID="cd964862e704eeaad08f08ea2b13d80bf6211d9a083485cc621c49d7a49ef805" exitCode=0 Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.623822 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c7ns9" event={"ID":"d86572db-86bc-4cd2-b551-a66bf5c47c7a","Type":"ContainerDied","Data":"cd964862e704eeaad08f08ea2b13d80bf6211d9a083485cc621c49d7a49ef805"} Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.626055 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-645cdc8bdf-bkt49" 
event={"ID":"28a2632f-7155-4c9e-9767-fcda3ff0688b","Type":"ContainerStarted","Data":"3943c0d375da4498b570d9548262c10f16a0853ca84fa9fc77bdc42e60209d82"} Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.626092 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-645cdc8bdf-bkt49" event={"ID":"28a2632f-7155-4c9e-9767-fcda3ff0688b","Type":"ContainerStarted","Data":"726189f305537bc82ee414ac37b92fd362ce02d98fa1cddd2e3262c25638d921"} Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.626501 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.626588 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.628578 4923 generic.go:334] "Generic (PLEG): container finished" podID="f491572b-dcd8-40f5-96e0-d6393b852858" containerID="f2e0dd235d7bc4fb88fb72b7bfb56721c5e7b331d69cd81a9ca05dcfbaa64d84" exitCode=0 Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.628625 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lplvk" event={"ID":"f491572b-dcd8-40f5-96e0-d6393b852858","Type":"ContainerDied","Data":"f2e0dd235d7bc4fb88fb72b7bfb56721c5e7b331d69cd81a9ca05dcfbaa64d84"} Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.631466 4923 generic.go:334] "Generic (PLEG): container finished" podID="80844021-cbc2-4e7f-bbd8-1bac3ae22d98" containerID="618aa6a6ca8bf3941d9f7264ebf924c5e788d9a4cc3d5451ab4cf331a09c9506" exitCode=0 Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.631540 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0144-account-create-update-j6dm5" event={"ID":"80844021-cbc2-4e7f-bbd8-1bac3ae22d98","Type":"ContainerDied","Data":"618aa6a6ca8bf3941d9f7264ebf924c5e788d9a4cc3d5451ab4cf331a09c9506"} Feb 24 03:14:21 crc 
kubenswrapper[4923]: I0224 03:14:21.633881 4923 generic.go:334] "Generic (PLEG): container finished" podID="92b92176-765c-478f-a4df-26d52a903476" containerID="75d57347f54d6b4778b6aac2f865ba00f5472bb0064d1ce6be80c984c7df2aa1" exitCode=0 Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.634052 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-03dc-account-create-update-blt67" event={"ID":"92b92176-765c-478f-a4df-26d52a903476","Type":"ContainerDied","Data":"75d57347f54d6b4778b6aac2f865ba00f5472bb0064d1ce6be80c984c7df2aa1"} Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.637784 4923 generic.go:334] "Generic (PLEG): container finished" podID="2465587c-f0e4-4755-acd6-016d2b1a4cbf" containerID="bafe4c81209226219c61c81e8f739f9ef46acfd44534f718c33bc0d3edfd688c" exitCode=0 Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.638350 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" event={"ID":"2465587c-f0e4-4755-acd6-016d2b1a4cbf","Type":"ContainerDied","Data":"bafe4c81209226219c61c81e8f739f9ef46acfd44534f718c33bc0d3edfd688c"} Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.643531 4923 generic.go:334] "Generic (PLEG): container finished" podID="395dac89-2c47-446a-8254-0ba691868651" containerID="986592f6665163ed2d2a4d9717fc92bb03a0a2a4b22849ea18722264ec565113" exitCode=0 Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.643605 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2xw9b" event={"ID":"395dac89-2c47-446a-8254-0ba691868651","Type":"ContainerDied","Data":"986592f6665163ed2d2a4d9717fc92bb03a0a2a4b22849ea18722264ec565113"} Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.644942 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerStarted","Data":"ed4178af7308a9b36f45629f59fd906f72eabe7104b6178bff1fa9bd22a29016"} Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.714008 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-645cdc8bdf-bkt49" podStartSLOduration=7.713987374 podStartE2EDuration="7.713987374s" podCreationTimestamp="2026-02-24 03:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:14:21.68471409 +0000 UTC m=+1185.701784903" watchObservedRunningTime="2026-02-24 03:14:21.713987374 +0000 UTC m=+1185.731058187" Feb 24 03:14:21 crc kubenswrapper[4923]: I0224 03:14:21.737756 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4afbe98-b630-4013-83b8-778ddbeb8b27" path="/var/lib/kubelet/pods/e4afbe98-b630-4013-83b8-778ddbeb8b27/volumes" Feb 24 03:14:22 crc kubenswrapper[4923]: I0224 03:14:22.661754 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerStarted","Data":"fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f"} Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.167740 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.215429 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2465587c-f0e4-4755-acd6-016d2b1a4cbf-operator-scripts\") pod \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\" (UID: \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.215641 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzlh7\" (UniqueName: \"kubernetes.io/projected/2465587c-f0e4-4755-acd6-016d2b1a4cbf-kube-api-access-zzlh7\") pod \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\" (UID: \"2465587c-f0e4-4755-acd6-016d2b1a4cbf\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.217924 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2465587c-f0e4-4755-acd6-016d2b1a4cbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2465587c-f0e4-4755-acd6-016d2b1a4cbf" (UID: "2465587c-f0e4-4755-acd6-016d2b1a4cbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.244398 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2465587c-f0e4-4755-acd6-016d2b1a4cbf-kube-api-access-zzlh7" (OuterVolumeSpecName: "kube-api-access-zzlh7") pod "2465587c-f0e4-4755-acd6-016d2b1a4cbf" (UID: "2465587c-f0e4-4755-acd6-016d2b1a4cbf"). InnerVolumeSpecName "kube-api-access-zzlh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.320231 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2465587c-f0e4-4755-acd6-016d2b1a4cbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.320276 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzlh7\" (UniqueName: \"kubernetes.io/projected/2465587c-f0e4-4755-acd6-016d2b1a4cbf-kube-api-access-zzlh7\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.379199 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.385797 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.395571 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.415961 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.417051 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.424815 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d86572db-86bc-4cd2-b551-a66bf5c47c7a-operator-scripts\") pod \"d86572db-86bc-4cd2-b551-a66bf5c47c7a\" (UID: \"d86572db-86bc-4cd2-b551-a66bf5c47c7a\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.424859 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f491572b-dcd8-40f5-96e0-d6393b852858-operator-scripts\") pod \"f491572b-dcd8-40f5-96e0-d6393b852858\" (UID: \"f491572b-dcd8-40f5-96e0-d6393b852858\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.424986 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwxgs\" (UniqueName: \"kubernetes.io/projected/d86572db-86bc-4cd2-b551-a66bf5c47c7a-kube-api-access-fwxgs\") pod \"d86572db-86bc-4cd2-b551-a66bf5c47c7a\" (UID: \"d86572db-86bc-4cd2-b551-a66bf5c47c7a\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.425034 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92b92176-765c-478f-a4df-26d52a903476-operator-scripts\") pod \"92b92176-765c-478f-a4df-26d52a903476\" (UID: \"92b92176-765c-478f-a4df-26d52a903476\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.425167 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjhmx\" (UniqueName: \"kubernetes.io/projected/92b92176-765c-478f-a4df-26d52a903476-kube-api-access-wjhmx\") pod \"92b92176-765c-478f-a4df-26d52a903476\" (UID: \"92b92176-765c-478f-a4df-26d52a903476\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.425205 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x5df5\" (UniqueName: \"kubernetes.io/projected/f491572b-dcd8-40f5-96e0-d6393b852858-kube-api-access-x5df5\") pod \"f491572b-dcd8-40f5-96e0-d6393b852858\" (UID: \"f491572b-dcd8-40f5-96e0-d6393b852858\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.425339 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d86572db-86bc-4cd2-b551-a66bf5c47c7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d86572db-86bc-4cd2-b551-a66bf5c47c7a" (UID: "d86572db-86bc-4cd2-b551-a66bf5c47c7a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.425663 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d86572db-86bc-4cd2-b551-a66bf5c47c7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.425745 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f491572b-dcd8-40f5-96e0-d6393b852858-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f491572b-dcd8-40f5-96e0-d6393b852858" (UID: "f491572b-dcd8-40f5-96e0-d6393b852858"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.426792 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92b92176-765c-478f-a4df-26d52a903476-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92b92176-765c-478f-a4df-26d52a903476" (UID: "92b92176-765c-478f-a4df-26d52a903476"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.429561 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b92176-765c-478f-a4df-26d52a903476-kube-api-access-wjhmx" (OuterVolumeSpecName: "kube-api-access-wjhmx") pod "92b92176-765c-478f-a4df-26d52a903476" (UID: "92b92176-765c-478f-a4df-26d52a903476"). InnerVolumeSpecName "kube-api-access-wjhmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.429597 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d86572db-86bc-4cd2-b551-a66bf5c47c7a-kube-api-access-fwxgs" (OuterVolumeSpecName: "kube-api-access-fwxgs") pod "d86572db-86bc-4cd2-b551-a66bf5c47c7a" (UID: "d86572db-86bc-4cd2-b551-a66bf5c47c7a"). InnerVolumeSpecName "kube-api-access-fwxgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.429849 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f491572b-dcd8-40f5-96e0-d6393b852858-kube-api-access-x5df5" (OuterVolumeSpecName: "kube-api-access-x5df5") pod "f491572b-dcd8-40f5-96e0-d6393b852858" (UID: "f491572b-dcd8-40f5-96e0-d6393b852858"). InnerVolumeSpecName "kube-api-access-x5df5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527066 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-operator-scripts\") pod \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\" (UID: \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527105 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws9ql\" (UniqueName: \"kubernetes.io/projected/395dac89-2c47-446a-8254-0ba691868651-kube-api-access-ws9ql\") pod \"395dac89-2c47-446a-8254-0ba691868651\" (UID: \"395dac89-2c47-446a-8254-0ba691868651\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527132 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395dac89-2c47-446a-8254-0ba691868651-operator-scripts\") pod \"395dac89-2c47-446a-8254-0ba691868651\" (UID: \"395dac89-2c47-446a-8254-0ba691868651\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527155 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq7mk\" (UniqueName: \"kubernetes.io/projected/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-kube-api-access-xq7mk\") pod \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\" (UID: \"80844021-cbc2-4e7f-bbd8-1bac3ae22d98\") " Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527522 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwxgs\" (UniqueName: \"kubernetes.io/projected/d86572db-86bc-4cd2-b551-a66bf5c47c7a-kube-api-access-fwxgs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527538 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/92b92176-765c-478f-a4df-26d52a903476-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527550 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjhmx\" (UniqueName: \"kubernetes.io/projected/92b92176-765c-478f-a4df-26d52a903476-kube-api-access-wjhmx\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527548 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80844021-cbc2-4e7f-bbd8-1bac3ae22d98" (UID: "80844021-cbc2-4e7f-bbd8-1bac3ae22d98"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527587 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5df5\" (UniqueName: \"kubernetes.io/projected/f491572b-dcd8-40f5-96e0-d6393b852858-kube-api-access-x5df5\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.527600 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f491572b-dcd8-40f5-96e0-d6393b852858-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.528014 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/395dac89-2c47-446a-8254-0ba691868651-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "395dac89-2c47-446a-8254-0ba691868651" (UID: "395dac89-2c47-446a-8254-0ba691868651"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.531138 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-kube-api-access-xq7mk" (OuterVolumeSpecName: "kube-api-access-xq7mk") pod "80844021-cbc2-4e7f-bbd8-1bac3ae22d98" (UID: "80844021-cbc2-4e7f-bbd8-1bac3ae22d98"). InnerVolumeSpecName "kube-api-access-xq7mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.535705 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395dac89-2c47-446a-8254-0ba691868651-kube-api-access-ws9ql" (OuterVolumeSpecName: "kube-api-access-ws9ql") pod "395dac89-2c47-446a-8254-0ba691868651" (UID: "395dac89-2c47-446a-8254-0ba691868651"). InnerVolumeSpecName "kube-api-access-ws9ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.629381 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.629417 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws9ql\" (UniqueName: \"kubernetes.io/projected/395dac89-2c47-446a-8254-0ba691868651-kube-api-access-ws9ql\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.629429 4923 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395dac89-2c47-446a-8254-0ba691868651-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.629439 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq7mk\" (UniqueName: 
\"kubernetes.io/projected/80844021-cbc2-4e7f-bbd8-1bac3ae22d98-kube-api-access-xq7mk\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.768192 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7fb8677dd-w8wrp" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.825276 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.859203 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0144-account-create-update-j6dm5" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.859212 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0144-account-create-update-j6dm5" event={"ID":"80844021-cbc2-4e7f-bbd8-1bac3ae22d98","Type":"ContainerDied","Data":"f7e99bce2e497720c2d59788b7a4fa8da77fcbfa7cd23ab155178199cd82edfd"} Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.859268 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7e99bce2e497720c2d59788b7a4fa8da77fcbfa7cd23ab155178199cd82edfd" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.861137 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-03dc-account-create-update-blt67" event={"ID":"92b92176-765c-478f-a4df-26d52a903476","Type":"ContainerDied","Data":"739ca21c2c1925fe3cc56deb2f69af9ee2e421ba3345f4779673039c138dee51"} Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.861162 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="739ca21c2c1925fe3cc56deb2f69af9ee2e421ba3345f4779673039c138dee51" Feb 24 03:14:23 crc 
kubenswrapper[4923]: I0224 03:14:23.861203 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-03dc-account-create-update-blt67" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.878231 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" event={"ID":"2465587c-f0e4-4755-acd6-016d2b1a4cbf","Type":"ContainerDied","Data":"2a42eba36ea43ec89bf56ce463e6836c8e152421b8fd137038a81349065d7b18"} Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.878271 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a42eba36ea43ec89bf56ce463e6836c8e152421b8fd137038a81349065d7b18" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.878381 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fde5-account-create-update-bkgr4" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.879909 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2xw9b" event={"ID":"395dac89-2c47-446a-8254-0ba691868651","Type":"ContainerDied","Data":"e534eef905a2fba5dbbf3e673711c952655aef2c68fe370a7e18acfe76a05b2c"} Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.879940 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e534eef905a2fba5dbbf3e673711c952655aef2c68fe370a7e18acfe76a05b2c" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.879988 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2xw9b" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.886787 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lplvk" event={"ID":"f491572b-dcd8-40f5-96e0-d6393b852858","Type":"ContainerDied","Data":"1fc759cd59f71815bea1a5c97e442d9cdcd49609d4a5d3e23c79464d0c03eba9"} Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.886811 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fc759cd59f71815bea1a5c97e442d9cdcd49609d4a5d3e23c79464d0c03eba9" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.886851 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lplvk" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.895710 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c7ns9" event={"ID":"d86572db-86bc-4cd2-b551-a66bf5c47c7a","Type":"ContainerDied","Data":"e7df7d2cd61b8870dfd86c46c3600990ba59cf33a03e673835690d4f379d3a75"} Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.895762 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-c7ns9" Feb 24 03:14:23 crc kubenswrapper[4923]: I0224 03:14:23.895747 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7df7d2cd61b8870dfd86c46c3600990ba59cf33a03e673835690d4f379d3a75" Feb 24 03:14:24 crc kubenswrapper[4923]: I0224 03:14:24.793904 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-55b6b875d5-hmfv4" Feb 24 03:14:24 crc kubenswrapper[4923]: I0224 03:14:24.875445 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57f869f9f6-f2wpq"] Feb 24 03:14:24 crc kubenswrapper[4923]: I0224 03:14:24.876006 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57f869f9f6-f2wpq" podUID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerName="neutron-api" containerID="cri-o://b34d0a1c313990b17099067d7d4399a6487d36644a99fc842feb91da7815714f" gracePeriod=30 Feb 24 03:14:24 crc kubenswrapper[4923]: I0224 03:14:24.876669 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57f869f9f6-f2wpq" podUID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerName="neutron-httpd" containerID="cri-o://be8f1d820b924b4e30acc346afdbdc68223f3cc44f3913c62024f7379f36d1de" gracePeriod=30 Feb 24 03:14:24 crc kubenswrapper[4923]: I0224 03:14:24.915852 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerStarted","Data":"f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c"} Feb 24 03:14:24 crc kubenswrapper[4923]: I0224 03:14:24.915894 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerStarted","Data":"7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b"} Feb 24 03:14:25 crc kubenswrapper[4923]: I0224 03:14:25.399850 
4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:25 crc kubenswrapper[4923]: I0224 03:14:25.749823 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:25 crc kubenswrapper[4923]: I0224 03:14:25.925940 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f869f9f6-f2wpq" event={"ID":"061ab736-e68d-4053-b8d3-13ab8220ef22","Type":"ContainerDied","Data":"be8f1d820b924b4e30acc346afdbdc68223f3cc44f3913c62024f7379f36d1de"} Feb 24 03:14:25 crc kubenswrapper[4923]: I0224 03:14:25.925914 4923 generic.go:334] "Generic (PLEG): container finished" podID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerID="be8f1d820b924b4e30acc346afdbdc68223f3cc44f3913c62024f7379f36d1de" exitCode=0 Feb 24 03:14:26 crc kubenswrapper[4923]: I0224 03:14:26.936327 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerStarted","Data":"4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5"} Feb 24 03:14:26 crc kubenswrapper[4923]: I0224 03:14:26.936602 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="ceilometer-central-agent" containerID="cri-o://fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f" gracePeriod=30 Feb 24 03:14:26 crc kubenswrapper[4923]: I0224 03:14:26.936885 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 24 03:14:26 crc kubenswrapper[4923]: I0224 03:14:26.937019 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="proxy-httpd" containerID="cri-o://4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5" gracePeriod=30 Feb 24 
03:14:26 crc kubenswrapper[4923]: I0224 03:14:26.937111 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="sg-core" containerID="cri-o://f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c" gracePeriod=30 Feb 24 03:14:26 crc kubenswrapper[4923]: I0224 03:14:26.937171 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="ceilometer-notification-agent" containerID="cri-o://7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b" gracePeriod=30 Feb 24 03:14:26 crc kubenswrapper[4923]: I0224 03:14:26.965789 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.360531652 podStartE2EDuration="6.965770373s" podCreationTimestamp="2026-02-24 03:14:20 +0000 UTC" firstStartedPulling="2026-02-24 03:14:21.538148194 +0000 UTC m=+1185.555219007" lastFinishedPulling="2026-02-24 03:14:26.143386915 +0000 UTC m=+1190.160457728" observedRunningTime="2026-02-24 03:14:26.959245113 +0000 UTC m=+1190.976315926" watchObservedRunningTime="2026-02-24 03:14:26.965770373 +0000 UTC m=+1190.982841186" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417007 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlhvh"] Feb 24 03:14:27 crc kubenswrapper[4923]: E0224 03:14:27.417413 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d86572db-86bc-4cd2-b551-a66bf5c47c7a" containerName="mariadb-database-create" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417434 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d86572db-86bc-4cd2-b551-a66bf5c47c7a" containerName="mariadb-database-create" Feb 24 03:14:27 crc kubenswrapper[4923]: E0224 03:14:27.417450 4923 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="92b92176-765c-478f-a4df-26d52a903476" containerName="mariadb-account-create-update" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417460 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b92176-765c-478f-a4df-26d52a903476" containerName="mariadb-account-create-update" Feb 24 03:14:27 crc kubenswrapper[4923]: E0224 03:14:27.417470 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2465587c-f0e4-4755-acd6-016d2b1a4cbf" containerName="mariadb-account-create-update" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417479 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="2465587c-f0e4-4755-acd6-016d2b1a4cbf" containerName="mariadb-account-create-update" Feb 24 03:14:27 crc kubenswrapper[4923]: E0224 03:14:27.417494 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80844021-cbc2-4e7f-bbd8-1bac3ae22d98" containerName="mariadb-account-create-update" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417500 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="80844021-cbc2-4e7f-bbd8-1bac3ae22d98" containerName="mariadb-account-create-update" Feb 24 03:14:27 crc kubenswrapper[4923]: E0224 03:14:27.417509 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395dac89-2c47-446a-8254-0ba691868651" containerName="mariadb-database-create" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417516 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="395dac89-2c47-446a-8254-0ba691868651" containerName="mariadb-database-create" Feb 24 03:14:27 crc kubenswrapper[4923]: E0224 03:14:27.417531 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f491572b-dcd8-40f5-96e0-d6393b852858" containerName="mariadb-database-create" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417536 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f491572b-dcd8-40f5-96e0-d6393b852858" containerName="mariadb-database-create" Feb 24 
03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417696 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f491572b-dcd8-40f5-96e0-d6393b852858" containerName="mariadb-database-create" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417716 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b92176-765c-478f-a4df-26d52a903476" containerName="mariadb-account-create-update" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417729 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="395dac89-2c47-446a-8254-0ba691868651" containerName="mariadb-database-create" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417748 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="2465587c-f0e4-4755-acd6-016d2b1a4cbf" containerName="mariadb-account-create-update" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417759 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="80844021-cbc2-4e7f-bbd8-1bac3ae22d98" containerName="mariadb-account-create-update" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.417769 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d86572db-86bc-4cd2-b551-a66bf5c47c7a" containerName="mariadb-database-create" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.418411 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.420540 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gg5bz" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.421076 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.421090 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.426502 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlhvh"] Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.533013 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.533080 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-scripts\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.533175 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-config-data\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " 
pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.533212 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tp96\" (UniqueName: \"kubernetes.io/projected/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-kube-api-access-4tp96\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.634275 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-scripts\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.634409 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-config-data\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.634441 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tp96\" (UniqueName: \"kubernetes.io/projected/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-kube-api-access-4tp96\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.634551 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: 
\"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.639906 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-scripts\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.640039 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-config-data\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.651883 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.652160 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tp96\" (UniqueName: \"kubernetes.io/projected/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-kube-api-access-4tp96\") pod \"nova-cell0-conductor-db-sync-qlhvh\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") " pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.743808 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qlhvh" Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.956645 4923 generic.go:334] "Generic (PLEG): container finished" podID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerID="4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5" exitCode=0 Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.957124 4923 generic.go:334] "Generic (PLEG): container finished" podID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerID="f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c" exitCode=2 Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.957138 4923 generic.go:334] "Generic (PLEG): container finished" podID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerID="7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b" exitCode=0 Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.957172 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerDied","Data":"4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5"} Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.957207 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerDied","Data":"f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c"} Feb 24 03:14:27 crc kubenswrapper[4923]: I0224 03:14:27.957222 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerDied","Data":"7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b"} Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.215281 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlhvh"] Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.630771 4923 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.758916 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-config-data\") pod \"efcc7805-95e8-4625-b5f9-a1875fbba24d\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.759230 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-combined-ca-bundle\") pod \"efcc7805-95e8-4625-b5f9-a1875fbba24d\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.759272 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-scripts\") pod \"efcc7805-95e8-4625-b5f9-a1875fbba24d\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.759329 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-log-httpd\") pod \"efcc7805-95e8-4625-b5f9-a1875fbba24d\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.759353 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db492\" (UniqueName: \"kubernetes.io/projected/efcc7805-95e8-4625-b5f9-a1875fbba24d-kube-api-access-db492\") pod \"efcc7805-95e8-4625-b5f9-a1875fbba24d\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.759436 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-run-httpd\") pod \"efcc7805-95e8-4625-b5f9-a1875fbba24d\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.759501 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-sg-core-conf-yaml\") pod \"efcc7805-95e8-4625-b5f9-a1875fbba24d\" (UID: \"efcc7805-95e8-4625-b5f9-a1875fbba24d\") " Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.760492 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "efcc7805-95e8-4625-b5f9-a1875fbba24d" (UID: "efcc7805-95e8-4625-b5f9-a1875fbba24d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.760551 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "efcc7805-95e8-4625-b5f9-a1875fbba24d" (UID: "efcc7805-95e8-4625-b5f9-a1875fbba24d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.775545 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efcc7805-95e8-4625-b5f9-a1875fbba24d-kube-api-access-db492" (OuterVolumeSpecName: "kube-api-access-db492") pod "efcc7805-95e8-4625-b5f9-a1875fbba24d" (UID: "efcc7805-95e8-4625-b5f9-a1875fbba24d"). InnerVolumeSpecName "kube-api-access-db492". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.794445 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-scripts" (OuterVolumeSpecName: "scripts") pod "efcc7805-95e8-4625-b5f9-a1875fbba24d" (UID: "efcc7805-95e8-4625-b5f9-a1875fbba24d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.837307 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "efcc7805-95e8-4625-b5f9-a1875fbba24d" (UID: "efcc7805-95e8-4625-b5f9-a1875fbba24d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.861984 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.863347 4923 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.863581 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db492\" (UniqueName: \"kubernetes.io/projected/efcc7805-95e8-4625-b5f9-a1875fbba24d-kube-api-access-db492\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.863689 4923 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efcc7805-95e8-4625-b5f9-a1875fbba24d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:28 crc 
kubenswrapper[4923]: I0224 03:14:28.863758 4923 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.897738 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-config-data" (OuterVolumeSpecName: "config-data") pod "efcc7805-95e8-4625-b5f9-a1875fbba24d" (UID: "efcc7805-95e8-4625-b5f9-a1875fbba24d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.913513 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efcc7805-95e8-4625-b5f9-a1875fbba24d" (UID: "efcc7805-95e8-4625-b5f9-a1875fbba24d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.965431 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.966504 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efcc7805-95e8-4625-b5f9-a1875fbba24d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.969950 4923 generic.go:334] "Generic (PLEG): container finished" podID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerID="fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f" exitCode=0 Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.970115 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerDied","Data":"fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f"} Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.970280 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efcc7805-95e8-4625-b5f9-a1875fbba24d","Type":"ContainerDied","Data":"ed4178af7308a9b36f45629f59fd906f72eabe7104b6178bff1fa9bd22a29016"} Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.970127 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.970326 4923 scope.go:117] "RemoveContainer" containerID="4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5" Feb 24 03:14:28 crc kubenswrapper[4923]: I0224 03:14:28.980939 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qlhvh" event={"ID":"a5bfccf6-bf49-4a31-9367-afe9f29cbf74","Type":"ContainerStarted","Data":"9b2005a22651404f5e35fb2efcd4a507457f762849a80c7cca376e10f42d1a96"} Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.008486 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.016704 4923 scope.go:117] "RemoveContainer" containerID="f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.021444 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.027679 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:29 crc kubenswrapper[4923]: E0224 03:14:29.028002 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="sg-core" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.028018 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="sg-core" Feb 24 03:14:29 crc kubenswrapper[4923]: E0224 03:14:29.028035 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="ceilometer-central-agent" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.028042 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="ceilometer-central-agent" Feb 24 03:14:29 crc 
kubenswrapper[4923]: E0224 03:14:29.028051 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="ceilometer-notification-agent" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.028059 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="ceilometer-notification-agent" Feb 24 03:14:29 crc kubenswrapper[4923]: E0224 03:14:29.028088 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="proxy-httpd" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.028094 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="proxy-httpd" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.028254 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="ceilometer-central-agent" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.028263 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="proxy-httpd" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.028279 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="sg-core" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.028289 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" containerName="ceilometer-notification-agent" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.036289 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.043052 4923 scope.go:117] "RemoveContainer" containerID="7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.045102 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.045319 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.066560 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.081724 4923 scope.go:117] "RemoveContainer" containerID="fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.143460 4923 scope.go:117] "RemoveContainer" containerID="4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5" Feb 24 03:14:29 crc kubenswrapper[4923]: E0224 03:14:29.143968 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5\": container with ID starting with 4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5 not found: ID does not exist" containerID="4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.144037 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5"} err="failed to get container status \"4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5\": rpc error: code = NotFound desc = could not find container \"4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5\": 
container with ID starting with 4d54e4d45fdd551ab1f8cd4ede765a1378b292e63238c47708eaeb0408c858d5 not found: ID does not exist" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.144076 4923 scope.go:117] "RemoveContainer" containerID="f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c" Feb 24 03:14:29 crc kubenswrapper[4923]: E0224 03:14:29.144395 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c\": container with ID starting with f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c not found: ID does not exist" containerID="f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.144414 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c"} err="failed to get container status \"f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c\": rpc error: code = NotFound desc = could not find container \"f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c\": container with ID starting with f92151c65f6e3ae177bc83720eef6d047834c82f951440c8683470100531d11c not found: ID does not exist" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.144438 4923 scope.go:117] "RemoveContainer" containerID="7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b" Feb 24 03:14:29 crc kubenswrapper[4923]: E0224 03:14:29.146177 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b\": container with ID starting with 7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b not found: ID does not exist" 
containerID="7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.146197 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b"} err="failed to get container status \"7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b\": rpc error: code = NotFound desc = could not find container \"7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b\": container with ID starting with 7c43264b000e6ef530aeb4e5ceb9a9aced5e063ba1332c3b65a39bbf4374cb4b not found: ID does not exist" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.146209 4923 scope.go:117] "RemoveContainer" containerID="fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f" Feb 24 03:14:29 crc kubenswrapper[4923]: E0224 03:14:29.146519 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f\": container with ID starting with fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f not found: ID does not exist" containerID="fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.146551 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f"} err="failed to get container status \"fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f\": rpc error: code = NotFound desc = could not find container \"fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f\": container with ID starting with fefc683f6457e540773eb465f3220be8bb8e47f82ae4266149fcd5a78cf8aa6f not found: ID does not exist" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.169072 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-log-httpd\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.169123 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-config-data\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.169144 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-run-httpd\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.169284 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.169340 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-scripts\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.169366 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.169403 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4clc\" (UniqueName: \"kubernetes.io/projected/caaef22c-e9f4-4ca0-b534-943351985382-kube-api-access-g4clc\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.271327 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.271407 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-scripts\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.271433 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.271472 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4clc\" (UniqueName: \"kubernetes.io/projected/caaef22c-e9f4-4ca0-b534-943351985382-kube-api-access-g4clc\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " 
pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.271505 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-log-httpd\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.271529 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-config-data\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.271542 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-run-httpd\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.272890 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-log-httpd\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.273078 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-run-httpd\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.274670 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.275584 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-config-data\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.276497 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-scripts\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.279129 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.288674 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4clc\" (UniqueName: \"kubernetes.io/projected/caaef22c-e9f4-4ca0-b534-943351985382-kube-api-access-g4clc\") pod \"ceilometer-0\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.368560 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.732106 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efcc7805-95e8-4625-b5f9-a1875fbba24d" path="/var/lib/kubelet/pods/efcc7805-95e8-4625-b5f9-a1875fbba24d/volumes" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.943459 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:14:29 crc kubenswrapper[4923]: I0224 03:14:29.958095 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.007409 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerStarted","Data":"c4d46c6414b78f5124802e1cb28f857dcea7ea733a1b8de1357a4c4bf5610565"} Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.012483 4923 generic.go:334] "Generic (PLEG): container finished" podID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerID="56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f" exitCode=137 Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.012569 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8677dd-w8wrp" event={"ID":"260b26fd-552c-4dbb-b181-d423dbd57de2","Type":"ContainerDied","Data":"56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f"} Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.012585 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7fb8677dd-w8wrp" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.012601 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7fb8677dd-w8wrp" event={"ID":"260b26fd-552c-4dbb-b181-d423dbd57de2","Type":"ContainerDied","Data":"2bf5def33882cf573f78153ea23649dc9d2a81017a1bba44a1fe03a91b9eba3d"} Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.012622 4923 scope.go:117] "RemoveContainer" containerID="eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.086749 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260b26fd-552c-4dbb-b181-d423dbd57de2-logs\") pod \"260b26fd-552c-4dbb-b181-d423dbd57de2\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.086826 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-combined-ca-bundle\") pod \"260b26fd-552c-4dbb-b181-d423dbd57de2\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.086884 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-tls-certs\") pod \"260b26fd-552c-4dbb-b181-d423dbd57de2\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.086922 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-config-data\") pod \"260b26fd-552c-4dbb-b181-d423dbd57de2\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 
03:14:30.086960 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6whmk\" (UniqueName: \"kubernetes.io/projected/260b26fd-552c-4dbb-b181-d423dbd57de2-kube-api-access-6whmk\") pod \"260b26fd-552c-4dbb-b181-d423dbd57de2\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.086978 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-secret-key\") pod \"260b26fd-552c-4dbb-b181-d423dbd57de2\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.087007 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-scripts\") pod \"260b26fd-552c-4dbb-b181-d423dbd57de2\" (UID: \"260b26fd-552c-4dbb-b181-d423dbd57de2\") " Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.088278 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/260b26fd-552c-4dbb-b181-d423dbd57de2-logs" (OuterVolumeSpecName: "logs") pod "260b26fd-552c-4dbb-b181-d423dbd57de2" (UID: "260b26fd-552c-4dbb-b181-d423dbd57de2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.093051 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/260b26fd-552c-4dbb-b181-d423dbd57de2-kube-api-access-6whmk" (OuterVolumeSpecName: "kube-api-access-6whmk") pod "260b26fd-552c-4dbb-b181-d423dbd57de2" (UID: "260b26fd-552c-4dbb-b181-d423dbd57de2"). InnerVolumeSpecName "kube-api-access-6whmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.094347 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "260b26fd-552c-4dbb-b181-d423dbd57de2" (UID: "260b26fd-552c-4dbb-b181-d423dbd57de2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.113639 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-scripts" (OuterVolumeSpecName: "scripts") pod "260b26fd-552c-4dbb-b181-d423dbd57de2" (UID: "260b26fd-552c-4dbb-b181-d423dbd57de2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.115679 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-config-data" (OuterVolumeSpecName: "config-data") pod "260b26fd-552c-4dbb-b181-d423dbd57de2" (UID: "260b26fd-552c-4dbb-b181-d423dbd57de2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.130051 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "260b26fd-552c-4dbb-b181-d423dbd57de2" (UID: "260b26fd-552c-4dbb-b181-d423dbd57de2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.144812 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "260b26fd-552c-4dbb-b181-d423dbd57de2" (UID: "260b26fd-552c-4dbb-b181-d423dbd57de2"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.180982 4923 scope.go:117] "RemoveContainer" containerID="56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.189170 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.189202 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6whmk\" (UniqueName: \"kubernetes.io/projected/260b26fd-552c-4dbb-b181-d423dbd57de2-kube-api-access-6whmk\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.189212 4923 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.189222 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/260b26fd-552c-4dbb-b181-d423dbd57de2-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.189230 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260b26fd-552c-4dbb-b181-d423dbd57de2-logs\") on node \"crc\" DevicePath \"\"" Feb 
24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.189238 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.189248 4923 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/260b26fd-552c-4dbb-b181-d423dbd57de2-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.215279 4923 scope.go:117] "RemoveContainer" containerID="eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112" Feb 24 03:14:30 crc kubenswrapper[4923]: E0224 03:14:30.215902 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112\": container with ID starting with eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112 not found: ID does not exist" containerID="eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.215950 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112"} err="failed to get container status \"eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112\": rpc error: code = NotFound desc = could not find container \"eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112\": container with ID starting with eb6dc2bd50631ac3c6bcb24200408fc8c381aff3618506834cab3ba070373112 not found: ID does not exist" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.215983 4923 scope.go:117] "RemoveContainer" containerID="56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f" Feb 24 03:14:30 crc kubenswrapper[4923]: 
E0224 03:14:30.216742 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f\": container with ID starting with 56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f not found: ID does not exist" containerID="56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.216769 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f"} err="failed to get container status \"56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f\": rpc error: code = NotFound desc = could not find container \"56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f\": container with ID starting with 56c679cc8bc9a394faf66c327b9d9daede87630b6ecdfd96142d5fbe328abf7f not found: ID does not exist" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.347924 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7fb8677dd-w8wrp"] Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.356855 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7fb8677dd-w8wrp"] Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.400316 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-645cdc8bdf-bkt49" Feb 24 03:14:30 crc kubenswrapper[4923]: I0224 03:14:30.539222 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.031707 4923 generic.go:334] "Generic (PLEG): container finished" podID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerID="b34d0a1c313990b17099067d7d4399a6487d36644a99fc842feb91da7815714f" exitCode=0 Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.031778 4923 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f869f9f6-f2wpq" event={"ID":"061ab736-e68d-4053-b8d3-13ab8220ef22","Type":"ContainerDied","Data":"b34d0a1c313990b17099067d7d4399a6487d36644a99fc842feb91da7815714f"} Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.706119 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.726314 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" path="/var/lib/kubelet/pods/260b26fd-552c-4dbb-b181-d423dbd57de2/volumes" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.856524 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-httpd-config\") pod \"061ab736-e68d-4053-b8d3-13ab8220ef22\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.856613 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jjws\" (UniqueName: \"kubernetes.io/projected/061ab736-e68d-4053-b8d3-13ab8220ef22-kube-api-access-2jjws\") pod \"061ab736-e68d-4053-b8d3-13ab8220ef22\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.856642 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-config\") pod \"061ab736-e68d-4053-b8d3-13ab8220ef22\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.856784 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-ovndb-tls-certs\") pod 
\"061ab736-e68d-4053-b8d3-13ab8220ef22\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.856810 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-combined-ca-bundle\") pod \"061ab736-e68d-4053-b8d3-13ab8220ef22\" (UID: \"061ab736-e68d-4053-b8d3-13ab8220ef22\") " Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.865864 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "061ab736-e68d-4053-b8d3-13ab8220ef22" (UID: "061ab736-e68d-4053-b8d3-13ab8220ef22"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.866213 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061ab736-e68d-4053-b8d3-13ab8220ef22-kube-api-access-2jjws" (OuterVolumeSpecName: "kube-api-access-2jjws") pod "061ab736-e68d-4053-b8d3-13ab8220ef22" (UID: "061ab736-e68d-4053-b8d3-13ab8220ef22"). InnerVolumeSpecName "kube-api-access-2jjws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.932649 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "061ab736-e68d-4053-b8d3-13ab8220ef22" (UID: "061ab736-e68d-4053-b8d3-13ab8220ef22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.933222 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-config" (OuterVolumeSpecName: "config") pod "061ab736-e68d-4053-b8d3-13ab8220ef22" (UID: "061ab736-e68d-4053-b8d3-13ab8220ef22"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.949718 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "061ab736-e68d-4053-b8d3-13ab8220ef22" (UID: "061ab736-e68d-4053-b8d3-13ab8220ef22"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.962470 4923 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.962504 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.962514 4923 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.962525 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jjws\" (UniqueName: \"kubernetes.io/projected/061ab736-e68d-4053-b8d3-13ab8220ef22-kube-api-access-2jjws\") on node \"crc\" DevicePath \"\"" Feb 24 
03:14:31 crc kubenswrapper[4923]: I0224 03:14:31.962536 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/061ab736-e68d-4053-b8d3-13ab8220ef22-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:32 crc kubenswrapper[4923]: I0224 03:14:32.045520 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerStarted","Data":"7ed4bef870cd5591742472697f31998e6d865721af6dd8c097c21e7145f19158"} Feb 24 03:14:32 crc kubenswrapper[4923]: I0224 03:14:32.045559 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerStarted","Data":"9aa9a41987a9cbc2a537716b3c3c740914d8cdb2d920f66f66248b16d5c304cc"} Feb 24 03:14:32 crc kubenswrapper[4923]: I0224 03:14:32.047322 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f869f9f6-f2wpq" event={"ID":"061ab736-e68d-4053-b8d3-13ab8220ef22","Type":"ContainerDied","Data":"1e9250594c86112449a3035e9dc00e0dc416ccfc767d46e1a4a5837214a15e0f"} Feb 24 03:14:32 crc kubenswrapper[4923]: I0224 03:14:32.047347 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57f869f9f6-f2wpq" Feb 24 03:14:32 crc kubenswrapper[4923]: I0224 03:14:32.047375 4923 scope.go:117] "RemoveContainer" containerID="be8f1d820b924b4e30acc346afdbdc68223f3cc44f3913c62024f7379f36d1de" Feb 24 03:14:32 crc kubenswrapper[4923]: I0224 03:14:32.068995 4923 scope.go:117] "RemoveContainer" containerID="b34d0a1c313990b17099067d7d4399a6487d36644a99fc842feb91da7815714f" Feb 24 03:14:32 crc kubenswrapper[4923]: I0224 03:14:32.078789 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57f869f9f6-f2wpq"] Feb 24 03:14:32 crc kubenswrapper[4923]: I0224 03:14:32.086266 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-57f869f9f6-f2wpq"] Feb 24 03:14:33 crc kubenswrapper[4923]: I0224 03:14:33.063602 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerStarted","Data":"7a139f047a5ae84a7fc57e83cc3565a1ca4c169a2426e55da5e73a4d29cd4dc1"} Feb 24 03:14:33 crc kubenswrapper[4923]: I0224 03:14:33.235312 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:33 crc kubenswrapper[4923]: I0224 03:14:33.442588 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-9d7999766-h8pkz" Feb 24 03:14:33 crc kubenswrapper[4923]: I0224 03:14:33.510434 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-557784489b-tj6xd"] Feb 24 03:14:33 crc kubenswrapper[4923]: I0224 03:14:33.511138 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-557784489b-tj6xd" podUID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerName="placement-log" containerID="cri-o://f47b21d2eb78e0fa5b014dd1fdfcae3eb1646e6a6d51602bfe879c63435b7a7b" gracePeriod=30 Feb 24 03:14:33 crc kubenswrapper[4923]: I0224 03:14:33.513387 4923 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-557784489b-tj6xd" podUID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerName="placement-api" containerID="cri-o://656653f429ff4f783e8d509dcb9631e292a9daea43b820b356ac16a1b4157b00" gracePeriod=30 Feb 24 03:14:33 crc kubenswrapper[4923]: I0224 03:14:33.725630 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061ab736-e68d-4053-b8d3-13ab8220ef22" path="/var/lib/kubelet/pods/061ab736-e68d-4053-b8d3-13ab8220ef22/volumes" Feb 24 03:14:34 crc kubenswrapper[4923]: I0224 03:14:34.077759 4923 generic.go:334] "Generic (PLEG): container finished" podID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerID="f47b21d2eb78e0fa5b014dd1fdfcae3eb1646e6a6d51602bfe879c63435b7a7b" exitCode=143 Feb 24 03:14:34 crc kubenswrapper[4923]: I0224 03:14:34.077837 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-557784489b-tj6xd" event={"ID":"fe80dcd0-03dd-4361-a639-a995c6f55a08","Type":"ContainerDied","Data":"f47b21d2eb78e0fa5b014dd1fdfcae3eb1646e6a6d51602bfe879c63435b7a7b"} Feb 24 03:14:37 crc kubenswrapper[4923]: I0224 03:14:37.043209 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:14:37 crc kubenswrapper[4923]: I0224 03:14:37.044027 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerName="glance-log" containerID="cri-o://e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700" gracePeriod=30 Feb 24 03:14:37 crc kubenswrapper[4923]: I0224 03:14:37.044105 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerName="glance-httpd" containerID="cri-o://6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4" gracePeriod=30 Feb 
24 03:14:37 crc kubenswrapper[4923]: I0224 03:14:37.105697 4923 generic.go:334] "Generic (PLEG): container finished" podID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerID="656653f429ff4f783e8d509dcb9631e292a9daea43b820b356ac16a1b4157b00" exitCode=0 Feb 24 03:14:37 crc kubenswrapper[4923]: I0224 03:14:37.105736 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-557784489b-tj6xd" event={"ID":"fe80dcd0-03dd-4361-a639-a995c6f55a08","Type":"ContainerDied","Data":"656653f429ff4f783e8d509dcb9631e292a9daea43b820b356ac16a1b4157b00"} Feb 24 03:14:38 crc kubenswrapper[4923]: I0224 03:14:38.108028 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:14:38 crc kubenswrapper[4923]: I0224 03:14:38.108763 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cb08edd9-6041-489a-8713-8bc00d88527c" containerName="glance-httpd" containerID="cri-o://f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4" gracePeriod=30 Feb 24 03:14:38 crc kubenswrapper[4923]: I0224 03:14:38.108459 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cb08edd9-6041-489a-8713-8bc00d88527c" containerName="glance-log" containerID="cri-o://8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11" gracePeriod=30 Feb 24 03:14:38 crc kubenswrapper[4923]: I0224 03:14:38.125026 4923 generic.go:334] "Generic (PLEG): container finished" podID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerID="e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700" exitCode=143 Feb 24 03:14:38 crc kubenswrapper[4923]: I0224 03:14:38.125150 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c8517902-92e0-4ee1-8765-9d7331ac90f4","Type":"ContainerDied","Data":"e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700"} Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.095817 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-557784489b-tj6xd" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.147899 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-557784489b-tj6xd" event={"ID":"fe80dcd0-03dd-4361-a639-a995c6f55a08","Type":"ContainerDied","Data":"86cbe5e4ae62d6c0073711e0a19cf065a8364bed46ebb210fce3f08f29a7a604"} Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.148357 4923 scope.go:117] "RemoveContainer" containerID="656653f429ff4f783e8d509dcb9631e292a9daea43b820b356ac16a1b4157b00" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.148578 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-557784489b-tj6xd" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.163384 4923 generic.go:334] "Generic (PLEG): container finished" podID="cb08edd9-6041-489a-8713-8bc00d88527c" containerID="8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11" exitCode=143 Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.163510 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb08edd9-6041-489a-8713-8bc00d88527c","Type":"ContainerDied","Data":"8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11"} Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.172746 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qlhvh" event={"ID":"a5bfccf6-bf49-4a31-9367-afe9f29cbf74","Type":"ContainerStarted","Data":"a6d81afc0f1ea5db3c8e12dd2b3757d1cf37f7e3b919a57b2128d42f99e88eba"} Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.196968 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lxnxq\" (UniqueName: \"kubernetes.io/projected/fe80dcd0-03dd-4361-a639-a995c6f55a08-kube-api-access-lxnxq\") pod \"fe80dcd0-03dd-4361-a639-a995c6f55a08\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.197105 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe80dcd0-03dd-4361-a639-a995c6f55a08-logs\") pod \"fe80dcd0-03dd-4361-a639-a995c6f55a08\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.197176 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-internal-tls-certs\") pod \"fe80dcd0-03dd-4361-a639-a995c6f55a08\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.197211 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-combined-ca-bundle\") pod \"fe80dcd0-03dd-4361-a639-a995c6f55a08\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.197263 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-public-tls-certs\") pod \"fe80dcd0-03dd-4361-a639-a995c6f55a08\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.197335 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-scripts\") pod \"fe80dcd0-03dd-4361-a639-a995c6f55a08\" (UID: 
\"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.197352 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-config-data\") pod \"fe80dcd0-03dd-4361-a639-a995c6f55a08\" (UID: \"fe80dcd0-03dd-4361-a639-a995c6f55a08\") " Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.201317 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe80dcd0-03dd-4361-a639-a995c6f55a08-logs" (OuterVolumeSpecName: "logs") pod "fe80dcd0-03dd-4361-a639-a995c6f55a08" (UID: "fe80dcd0-03dd-4361-a639-a995c6f55a08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.207920 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe80dcd0-03dd-4361-a639-a995c6f55a08-kube-api-access-lxnxq" (OuterVolumeSpecName: "kube-api-access-lxnxq") pod "fe80dcd0-03dd-4361-a639-a995c6f55a08" (UID: "fe80dcd0-03dd-4361-a639-a995c6f55a08"). InnerVolumeSpecName "kube-api-access-lxnxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.213640 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-scripts" (OuterVolumeSpecName: "scripts") pod "fe80dcd0-03dd-4361-a639-a995c6f55a08" (UID: "fe80dcd0-03dd-4361-a639-a995c6f55a08"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.218523 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qlhvh" podStartSLOduration=1.598226118 podStartE2EDuration="12.218503071s" podCreationTimestamp="2026-02-24 03:14:27 +0000 UTC" firstStartedPulling="2026-02-24 03:14:28.251129428 +0000 UTC m=+1192.268200241" lastFinishedPulling="2026-02-24 03:14:38.871406381 +0000 UTC m=+1202.888477194" observedRunningTime="2026-02-24 03:14:39.203209942 +0000 UTC m=+1203.220280755" watchObservedRunningTime="2026-02-24 03:14:39.218503071 +0000 UTC m=+1203.235573884" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.230152 4923 scope.go:117] "RemoveContainer" containerID="f47b21d2eb78e0fa5b014dd1fdfcae3eb1646e6a6d51602bfe879c63435b7a7b" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.279002 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe80dcd0-03dd-4361-a639-a995c6f55a08" (UID: "fe80dcd0-03dd-4361-a639-a995c6f55a08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.280350 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-config-data" (OuterVolumeSpecName: "config-data") pod "fe80dcd0-03dd-4361-a639-a995c6f55a08" (UID: "fe80dcd0-03dd-4361-a639-a995c6f55a08"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.299505 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxnxq\" (UniqueName: \"kubernetes.io/projected/fe80dcd0-03dd-4361-a639-a995c6f55a08-kube-api-access-lxnxq\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.299536 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe80dcd0-03dd-4361-a639-a995c6f55a08-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.299548 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.299557 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.299565 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.330568 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fe80dcd0-03dd-4361-a639-a995c6f55a08" (UID: "fe80dcd0-03dd-4361-a639-a995c6f55a08"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.336556 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fe80dcd0-03dd-4361-a639-a995c6f55a08" (UID: "fe80dcd0-03dd-4361-a639-a995c6f55a08"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.401650 4923 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.401691 4923 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe80dcd0-03dd-4361-a639-a995c6f55a08-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.479437 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-557784489b-tj6xd"] Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.486357 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-557784489b-tj6xd"] Feb 24 03:14:39 crc kubenswrapper[4923]: I0224 03:14:39.730521 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe80dcd0-03dd-4361-a639-a995c6f55a08" path="/var/lib/kubelet/pods/fe80dcd0-03dd-4361-a639-a995c6f55a08/volumes" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.222545 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerStarted","Data":"4f86b155c5991fc15d0653acee0a90987cb67855c3ee8d6fe214884b19dd7d58"} Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.223204 4923 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="ceilometer-central-agent" containerID="cri-o://9aa9a41987a9cbc2a537716b3c3c740914d8cdb2d920f66f66248b16d5c304cc" gracePeriod=30 Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.223479 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.223542 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="proxy-httpd" containerID="cri-o://4f86b155c5991fc15d0653acee0a90987cb67855c3ee8d6fe214884b19dd7d58" gracePeriod=30 Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.223577 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="ceilometer-notification-agent" containerID="cri-o://7ed4bef870cd5591742472697f31998e6d865721af6dd8c097c21e7145f19158" gracePeriod=30 Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.223668 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="sg-core" containerID="cri-o://7a139f047a5ae84a7fc57e83cc3565a1ca4c169a2426e55da5e73a4d29cd4dc1" gracePeriod=30 Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.256078 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3743124780000002 podStartE2EDuration="11.256052967s" podCreationTimestamp="2026-02-24 03:14:29 +0000 UTC" firstStartedPulling="2026-02-24 03:14:29.981895419 +0000 UTC m=+1193.998966222" lastFinishedPulling="2026-02-24 03:14:38.863635898 +0000 UTC m=+1202.880706711" observedRunningTime="2026-02-24 03:14:40.249584558 
+0000 UTC m=+1204.266655371" watchObservedRunningTime="2026-02-24 03:14:40.256052967 +0000 UTC m=+1204.273123780" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.682240 4923 scope.go:117] "RemoveContainer" containerID="5131905ec70ffd9c6835477b1371472eccad685adff6a608aaf8d5a993bba213" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.689785 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.831020 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-public-tls-certs\") pod \"c8517902-92e0-4ee1-8765-9d7331ac90f4\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.831083 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-combined-ca-bundle\") pod \"c8517902-92e0-4ee1-8765-9d7331ac90f4\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.831120 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-config-data\") pod \"c8517902-92e0-4ee1-8765-9d7331ac90f4\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.831152 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-httpd-run\") pod \"c8517902-92e0-4ee1-8765-9d7331ac90f4\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.831226 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-logs\") pod \"c8517902-92e0-4ee1-8765-9d7331ac90f4\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.831267 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c8517902-92e0-4ee1-8765-9d7331ac90f4\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.831307 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mjds\" (UniqueName: \"kubernetes.io/projected/c8517902-92e0-4ee1-8765-9d7331ac90f4-kube-api-access-2mjds\") pod \"c8517902-92e0-4ee1-8765-9d7331ac90f4\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.831369 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-scripts\") pod \"c8517902-92e0-4ee1-8765-9d7331ac90f4\" (UID: \"c8517902-92e0-4ee1-8765-9d7331ac90f4\") " Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.831725 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c8517902-92e0-4ee1-8765-9d7331ac90f4" (UID: "c8517902-92e0-4ee1-8765-9d7331ac90f4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.832011 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-logs" (OuterVolumeSpecName: "logs") pod "c8517902-92e0-4ee1-8765-9d7331ac90f4" (UID: "c8517902-92e0-4ee1-8765-9d7331ac90f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.842482 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-scripts" (OuterVolumeSpecName: "scripts") pod "c8517902-92e0-4ee1-8765-9d7331ac90f4" (UID: "c8517902-92e0-4ee1-8765-9d7331ac90f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.844224 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8517902-92e0-4ee1-8765-9d7331ac90f4-kube-api-access-2mjds" (OuterVolumeSpecName: "kube-api-access-2mjds") pod "c8517902-92e0-4ee1-8765-9d7331ac90f4" (UID: "c8517902-92e0-4ee1-8765-9d7331ac90f4"). InnerVolumeSpecName "kube-api-access-2mjds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.857403 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "c8517902-92e0-4ee1-8765-9d7331ac90f4" (UID: "c8517902-92e0-4ee1-8765-9d7331ac90f4"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.898440 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8517902-92e0-4ee1-8765-9d7331ac90f4" (UID: "c8517902-92e0-4ee1-8765-9d7331ac90f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.898978 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c8517902-92e0-4ee1-8765-9d7331ac90f4" (UID: "c8517902-92e0-4ee1-8765-9d7331ac90f4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.934172 4923 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.934206 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mjds\" (UniqueName: \"kubernetes.io/projected/c8517902-92e0-4ee1-8765-9d7331ac90f4-kube-api-access-2mjds\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.934216 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.934227 4923 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-public-tls-certs\") on node \"crc\" DevicePath 
\"\"" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.934237 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.934246 4923 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.934254 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8517902-92e0-4ee1-8765-9d7331ac90f4-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.964425 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-config-data" (OuterVolumeSpecName: "config-data") pod "c8517902-92e0-4ee1-8765-9d7331ac90f4" (UID: "c8517902-92e0-4ee1-8765-9d7331ac90f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:40 crc kubenswrapper[4923]: I0224 03:14:40.965553 4923 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.035850 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8517902-92e0-4ee1-8765-9d7331ac90f4-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.035890 4923 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.241446 4923 generic.go:334] "Generic (PLEG): container finished" podID="caaef22c-e9f4-4ca0-b534-943351985382" containerID="4f86b155c5991fc15d0653acee0a90987cb67855c3ee8d6fe214884b19dd7d58" exitCode=0 Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.242022 4923 generic.go:334] "Generic (PLEG): container finished" podID="caaef22c-e9f4-4ca0-b534-943351985382" containerID="7a139f047a5ae84a7fc57e83cc3565a1ca4c169a2426e55da5e73a4d29cd4dc1" exitCode=2 Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.241541 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerDied","Data":"4f86b155c5991fc15d0653acee0a90987cb67855c3ee8d6fe214884b19dd7d58"} Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.242071 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerDied","Data":"7a139f047a5ae84a7fc57e83cc3565a1ca4c169a2426e55da5e73a4d29cd4dc1"} Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.242036 4923 generic.go:334] "Generic (PLEG): 
container finished" podID="caaef22c-e9f4-4ca0-b534-943351985382" containerID="7ed4bef870cd5591742472697f31998e6d865721af6dd8c097c21e7145f19158" exitCode=0 Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.242092 4923 generic.go:334] "Generic (PLEG): container finished" podID="caaef22c-e9f4-4ca0-b534-943351985382" containerID="9aa9a41987a9cbc2a537716b3c3c740914d8cdb2d920f66f66248b16d5c304cc" exitCode=0 Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.242097 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerDied","Data":"7ed4bef870cd5591742472697f31998e6d865721af6dd8c097c21e7145f19158"} Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.242109 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerDied","Data":"9aa9a41987a9cbc2a537716b3c3c740914d8cdb2d920f66f66248b16d5c304cc"} Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.242120 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caaef22c-e9f4-4ca0-b534-943351985382","Type":"ContainerDied","Data":"c4d46c6414b78f5124802e1cb28f857dcea7ea733a1b8de1357a4c4bf5610565"} Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.242129 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d46c6414b78f5124802e1cb28f857dcea7ea733a1b8de1357a4c4bf5610565" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.243680 4923 generic.go:334] "Generic (PLEG): container finished" podID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerID="6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4" exitCode=0 Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.243711 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c8517902-92e0-4ee1-8765-9d7331ac90f4","Type":"ContainerDied","Data":"6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4"} Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.243728 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8517902-92e0-4ee1-8765-9d7331ac90f4","Type":"ContainerDied","Data":"9eca302f370a273f2dafdee277198a33467f2277ef7e66be7ff4825d38e07c2b"} Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.243747 4923 scope.go:117] "RemoveContainer" containerID="6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.243867 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.289753 4923 scope.go:117] "RemoveContainer" containerID="e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.292914 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.309843 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.322984 4923 scope.go:117] "RemoveContainer" containerID="6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.323499 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4\": container with ID starting with 6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4 not found: ID does not exist" containerID="6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.323531 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4"} err="failed to get container status \"6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4\": rpc error: code = NotFound desc = could not find container \"6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4\": container with ID starting with 6d8f953d837de4508355c3dafe77323b8097ca5c24c298a772eaf1d7c0cf69d4 not found: ID does not exist" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.323560 4923 scope.go:117] "RemoveContainer" containerID="e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.323862 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700\": container with ID starting with e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700 not 
found: ID does not exist" containerID="e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.323887 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700"} err="failed to get container status \"e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700\": rpc error: code = NotFound desc = could not find container \"e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700\": container with ID starting with e7089037826ac92663662aa816a1ff80b3e12c7126fa78b451796cd8ed112700 not found: ID does not exist" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.332924 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.340681 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341147 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341168 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341180 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerName="glance-httpd" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341189 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerName="glance-httpd" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341203 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerName="placement-api" 
Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341210 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerName="placement-api" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341233 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerName="placement-log" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341240 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerName="placement-log" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341254 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerName="neutron-api" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341261 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerName="neutron-api" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341276 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerName="neutron-httpd" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341283 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerName="neutron-httpd" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341317 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="proxy-httpd" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341326 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="proxy-httpd" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341335 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerName="glance-log" Feb 24 03:14:41 crc kubenswrapper[4923]: 
I0224 03:14:41.341342 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerName="glance-log" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341355 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="ceilometer-notification-agent" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341362 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="ceilometer-notification-agent" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341376 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="ceilometer-central-agent" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341384 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="ceilometer-central-agent" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341401 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon-log" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341409 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon-log" Feb 24 03:14:41 crc kubenswrapper[4923]: E0224 03:14:41.341420 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="sg-core" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341426 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="sg-core" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341658 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerName="neutron-api" Feb 24 03:14:41 crc kubenswrapper[4923]: 
I0224 03:14:41.341676 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerName="glance-httpd" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341685 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="ceilometer-notification-agent" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341696 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8517902-92e0-4ee1-8765-9d7331ac90f4" containerName="glance-log" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341709 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="061ab736-e68d-4053-b8d3-13ab8220ef22" containerName="neutron-httpd" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341725 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerName="placement-log" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341736 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341749 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe80dcd0-03dd-4361-a639-a995c6f55a08" containerName="placement-api" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341763 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="ceilometer-central-agent" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341776 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="260b26fd-552c-4dbb-b181-d423dbd57de2" containerName="horizon-log" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.341787 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="sg-core" Feb 24 03:14:41 crc 
kubenswrapper[4923]: I0224 03:14:41.341799 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="caaef22c-e9f4-4ca0-b534-943351985382" containerName="proxy-httpd" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.342920 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.348511 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.348830 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.357890 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.442104 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-run-httpd\") pod \"caaef22c-e9f4-4ca0-b534-943351985382\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.442182 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-combined-ca-bundle\") pod \"caaef22c-e9f4-4ca0-b534-943351985382\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.442252 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4clc\" (UniqueName: \"kubernetes.io/projected/caaef22c-e9f4-4ca0-b534-943351985382-kube-api-access-g4clc\") pod \"caaef22c-e9f4-4ca0-b534-943351985382\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " Feb 24 03:14:41 crc kubenswrapper[4923]: 
I0224 03:14:41.442283 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-sg-core-conf-yaml\") pod \"caaef22c-e9f4-4ca0-b534-943351985382\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.442392 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-scripts\") pod \"caaef22c-e9f4-4ca0-b534-943351985382\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.442438 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-config-data\") pod \"caaef22c-e9f4-4ca0-b534-943351985382\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.442467 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-log-httpd\") pod \"caaef22c-e9f4-4ca0-b534-943351985382\" (UID: \"caaef22c-e9f4-4ca0-b534-943351985382\") " Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.442738 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7vc\" (UniqueName: \"kubernetes.io/projected/3b54c615-8156-4ec6-aee7-b8c9448a574e-kube-api-access-gt7vc\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.442943 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "caaef22c-e9f4-4ca0-b534-943351985382" (UID: "caaef22c-e9f4-4ca0-b534-943351985382"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443089 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b54c615-8156-4ec6-aee7-b8c9448a574e-logs\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443135 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443197 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443227 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-scripts\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443231 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "caaef22c-e9f4-4ca0-b534-943351985382" (UID: "caaef22c-e9f4-4ca0-b534-943351985382"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443260 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b54c615-8156-4ec6-aee7-b8c9448a574e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443423 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-config-data\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443464 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443720 4923 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.443751 4923 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caaef22c-e9f4-4ca0-b534-943351985382-log-httpd\") on node 
\"crc\" DevicePath \"\"" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.467774 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-scripts" (OuterVolumeSpecName: "scripts") pod "caaef22c-e9f4-4ca0-b534-943351985382" (UID: "caaef22c-e9f4-4ca0-b534-943351985382"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.467909 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caaef22c-e9f4-4ca0-b534-943351985382-kube-api-access-g4clc" (OuterVolumeSpecName: "kube-api-access-g4clc") pod "caaef22c-e9f4-4ca0-b534-943351985382" (UID: "caaef22c-e9f4-4ca0-b534-943351985382"). InnerVolumeSpecName "kube-api-access-g4clc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.475723 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "caaef22c-e9f4-4ca0-b534-943351985382" (UID: "caaef22c-e9f4-4ca0-b534-943351985382"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.528620 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caaef22c-e9f4-4ca0-b534-943351985382" (UID: "caaef22c-e9f4-4ca0-b534-943351985382"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546353 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7vc\" (UniqueName: \"kubernetes.io/projected/3b54c615-8156-4ec6-aee7-b8c9448a574e-kube-api-access-gt7vc\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546432 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b54c615-8156-4ec6-aee7-b8c9448a574e-logs\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546455 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546485 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546503 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-scripts\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc 
kubenswrapper[4923]: I0224 03:14:41.546523 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b54c615-8156-4ec6-aee7-b8c9448a574e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546540 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546555 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-config-data\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546665 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546676 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4clc\" (UniqueName: \"kubernetes.io/projected/caaef22c-e9f4-4ca0-b534-943351985382-kube-api-access-g4clc\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.546686 4923 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:41 crc 
kubenswrapper[4923]: I0224 03:14:41.546696 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.547725 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b54c615-8156-4ec6-aee7-b8c9448a574e-logs\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.548272 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.548365 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b54c615-8156-4ec6-aee7-b8c9448a574e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.554623 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-scripts\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.555970 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.560042 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.565015 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7vc\" (UniqueName: \"kubernetes.io/projected/3b54c615-8156-4ec6-aee7-b8c9448a574e-kube-api-access-gt7vc\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.573624 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b54c615-8156-4ec6-aee7-b8c9448a574e-config-data\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.579126 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3b54c615-8156-4ec6-aee7-b8c9448a574e\") " pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.587622 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-config-data" (OuterVolumeSpecName: "config-data") pod "caaef22c-e9f4-4ca0-b534-943351985382" (UID: "caaef22c-e9f4-4ca0-b534-943351985382"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.647972 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caaef22c-e9f4-4ca0-b534-943351985382-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.662459 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.779605 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8517902-92e0-4ee1-8765-9d7331ac90f4" path="/var/lib/kubelet/pods/c8517902-92e0-4ee1-8765-9d7331ac90f4/volumes" Feb 24 03:14:41 crc kubenswrapper[4923]: I0224 03:14:41.995508 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.167235 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxj7g\" (UniqueName: \"kubernetes.io/projected/cb08edd9-6041-489a-8713-8bc00d88527c-kube-api-access-kxj7g\") pod \"cb08edd9-6041-489a-8713-8bc00d88527c\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.167331 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-logs\") pod \"cb08edd9-6041-489a-8713-8bc00d88527c\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.167356 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-scripts\") pod \"cb08edd9-6041-489a-8713-8bc00d88527c\" (UID: 
\"cb08edd9-6041-489a-8713-8bc00d88527c\") " Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.167379 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cb08edd9-6041-489a-8713-8bc00d88527c\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.167400 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-httpd-run\") pod \"cb08edd9-6041-489a-8713-8bc00d88527c\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.167450 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-config-data\") pod \"cb08edd9-6041-489a-8713-8bc00d88527c\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.167506 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-combined-ca-bundle\") pod \"cb08edd9-6041-489a-8713-8bc00d88527c\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.167566 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-internal-tls-certs\") pod \"cb08edd9-6041-489a-8713-8bc00d88527c\" (UID: \"cb08edd9-6041-489a-8713-8bc00d88527c\") " Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.171323 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-httpd-run" 
(OuterVolumeSpecName: "httpd-run") pod "cb08edd9-6041-489a-8713-8bc00d88527c" (UID: "cb08edd9-6041-489a-8713-8bc00d88527c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.172333 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-logs" (OuterVolumeSpecName: "logs") pod "cb08edd9-6041-489a-8713-8bc00d88527c" (UID: "cb08edd9-6041-489a-8713-8bc00d88527c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.172505 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb08edd9-6041-489a-8713-8bc00d88527c-kube-api-access-kxj7g" (OuterVolumeSpecName: "kube-api-access-kxj7g") pod "cb08edd9-6041-489a-8713-8bc00d88527c" (UID: "cb08edd9-6041-489a-8713-8bc00d88527c"). InnerVolumeSpecName "kube-api-access-kxj7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.175592 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "cb08edd9-6041-489a-8713-8bc00d88527c" (UID: "cb08edd9-6041-489a-8713-8bc00d88527c"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.183676 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-scripts" (OuterVolumeSpecName: "scripts") pod "cb08edd9-6041-489a-8713-8bc00d88527c" (UID: "cb08edd9-6041-489a-8713-8bc00d88527c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.222530 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-config-data" (OuterVolumeSpecName: "config-data") pod "cb08edd9-6041-489a-8713-8bc00d88527c" (UID: "cb08edd9-6041-489a-8713-8bc00d88527c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.224493 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb08edd9-6041-489a-8713-8bc00d88527c" (UID: "cb08edd9-6041-489a-8713-8bc00d88527c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.256395 4923 generic.go:334] "Generic (PLEG): container finished" podID="cb08edd9-6041-489a-8713-8bc00d88527c" containerID="f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4" exitCode=0 Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.256465 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.256488 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.256516 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb08edd9-6041-489a-8713-8bc00d88527c","Type":"ContainerDied","Data":"f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4"} Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.256552 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cb08edd9-6041-489a-8713-8bc00d88527c","Type":"ContainerDied","Data":"f53b42741e1f2e4975286faad6425513bea80193822a908fece55138d420cd77"} Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.256572 4923 scope.go:117] "RemoveContainer" containerID="f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.262813 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cb08edd9-6041-489a-8713-8bc00d88527c" (UID: "cb08edd9-6041-489a-8713-8bc00d88527c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.269474 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.269501 4923 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.269514 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxj7g\" (UniqueName: \"kubernetes.io/projected/cb08edd9-6041-489a-8713-8bc00d88527c-kube-api-access-kxj7g\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.269529 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.269541 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.269575 4923 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.269589 4923 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cb08edd9-6041-489a-8713-8bc00d88527c-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.269601 4923 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb08edd9-6041-489a-8713-8bc00d88527c-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.287715 4923 scope.go:117] "RemoveContainer" containerID="8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.295725 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.298163 4923 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.315282 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.317674 4923 scope.go:117] "RemoveContainer" containerID="f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4" Feb 24 03:14:42 crc kubenswrapper[4923]: E0224 03:14:42.334156 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4\": container with ID starting with f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4 not found: ID does not exist" containerID="f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.334237 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4"} err="failed to get container status \"f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4\": rpc error: code = NotFound desc = could not find container \"f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4\": container with ID starting 
with f93d040957448cccd4cc84ffac2372edd5288de98958e37af6902ced0e9393f4 not found: ID does not exist" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.334263 4923 scope.go:117] "RemoveContainer" containerID="8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11" Feb 24 03:14:42 crc kubenswrapper[4923]: E0224 03:14:42.334816 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11\": container with ID starting with 8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11 not found: ID does not exist" containerID="8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.334856 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11"} err="failed to get container status \"8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11\": rpc error: code = NotFound desc = could not find container \"8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11\": container with ID starting with 8fccc6770b4bd1f1d31a27de5050d7a38775f44d0e2e5ade0fb320a0addbdf11 not found: ID does not exist" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.340391 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:42 crc kubenswrapper[4923]: E0224 03:14:42.341134 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb08edd9-6041-489a-8713-8bc00d88527c" containerName="glance-log" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.341154 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08edd9-6041-489a-8713-8bc00d88527c" containerName="glance-log" Feb 24 03:14:42 crc kubenswrapper[4923]: E0224 03:14:42.341180 4923 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cb08edd9-6041-489a-8713-8bc00d88527c" containerName="glance-httpd" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.341186 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08edd9-6041-489a-8713-8bc00d88527c" containerName="glance-httpd" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.343704 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb08edd9-6041-489a-8713-8bc00d88527c" containerName="glance-log" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.343738 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb08edd9-6041-489a-8713-8bc00d88527c" containerName="glance-httpd" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.351360 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.360847 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.360855 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.365554 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.373502 4923 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.429324 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 24 03:14:42 crc kubenswrapper[4923]: W0224 03:14:42.431241 4923 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b54c615_8156_4ec6_aee7_b8c9448a574e.slice/crio-06060645fb1569a7f6e74a4e6456c3cce4219a8a154dbe7504ed061601d8e605 WatchSource:0}: Error finding container 06060645fb1569a7f6e74a4e6456c3cce4219a8a154dbe7504ed061601d8e605: Status 404 returned error can't find the container with id 06060645fb1569a7f6e74a4e6456c3cce4219a8a154dbe7504ed061601d8e605 Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.475359 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.475413 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbvn4\" (UniqueName: \"kubernetes.io/projected/2c2bfcb9-4317-431a-a638-066d25df150a-kube-api-access-nbvn4\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.475443 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-run-httpd\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.475466 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-scripts\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.475487 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.475520 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-config-data\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.475847 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-log-httpd\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.577996 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.578042 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbvn4\" (UniqueName: \"kubernetes.io/projected/2c2bfcb9-4317-431a-a638-066d25df150a-kube-api-access-nbvn4\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.578069 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-run-httpd\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.578090 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-scripts\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.578108 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.578143 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-config-data\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.578189 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-log-httpd\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.578773 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-run-httpd\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.580272 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-log-httpd\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.583033 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.589420 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-scripts\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.590043 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-config-data\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.591866 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.599513 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.600236 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbvn4\" (UniqueName: 
\"kubernetes.io/projected/2c2bfcb9-4317-431a-a638-066d25df150a-kube-api-access-nbvn4\") pod \"ceilometer-0\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.611468 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.623344 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.625455 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.630221 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.630595 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.653179 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.679458 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.679500 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgg6q\" (UniqueName: \"kubernetes.io/projected/2656e2e0-085f-443d-ad1c-2243a4f92a11-kube-api-access-jgg6q\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") 
" pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.679577 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2656e2e0-085f-443d-ad1c-2243a4f92a11-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.679606 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.679664 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.679692 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.679714 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2656e2e0-085f-443d-ad1c-2243a4f92a11-logs\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " 
pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.679731 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.685683 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.781620 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgg6q\" (UniqueName: \"kubernetes.io/projected/2656e2e0-085f-443d-ad1c-2243a4f92a11-kube-api-access-jgg6q\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.782363 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2656e2e0-085f-443d-ad1c-2243a4f92a11-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.782410 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0" Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.782492 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.782528 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.782548 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2656e2e0-085f-443d-ad1c-2243a4f92a11-logs\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.782570 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.782621 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.784492 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.785180 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2656e2e0-085f-443d-ad1c-2243a4f92a11-logs\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.786518 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2656e2e0-085f-443d-ad1c-2243a4f92a11-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.804864 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.809645 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.812808 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.822545 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgg6q\" (UniqueName: \"kubernetes.io/projected/2656e2e0-085f-443d-ad1c-2243a4f92a11-kube-api-access-jgg6q\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.823175 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2656e2e0-085f-443d-ad1c-2243a4f92a11-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.826729 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"2656e2e0-085f-443d-ad1c-2243a4f92a11\") " pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:42 crc kubenswrapper[4923]: I0224 03:14:42.975568 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:43 crc kubenswrapper[4923]: I0224 03:14:43.150099 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 24 03:14:43 crc kubenswrapper[4923]: I0224 03:14:43.272321 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3b54c615-8156-4ec6-aee7-b8c9448a574e","Type":"ContainerStarted","Data":"5cfbec0431df8143f507d5a86e1c357a24abbe7b34f9f7f8b84ef027254197b1"}
Feb 24 03:14:43 crc kubenswrapper[4923]: I0224 03:14:43.272595 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3b54c615-8156-4ec6-aee7-b8c9448a574e","Type":"ContainerStarted","Data":"06060645fb1569a7f6e74a4e6456c3cce4219a8a154dbe7504ed061601d8e605"}
Feb 24 03:14:43 crc kubenswrapper[4923]: I0224 03:14:43.273416 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerStarted","Data":"60abcca2074c9f3e836e12a001abf0bccff22daff0a29d870e9d5bdfdcf22b88"}
Feb 24 03:14:43 crc kubenswrapper[4923]: I0224 03:14:43.542011 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 24 03:14:43 crc kubenswrapper[4923]: I0224 03:14:43.727686 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caaef22c-e9f4-4ca0-b534-943351985382" path="/var/lib/kubelet/pods/caaef22c-e9f4-4ca0-b534-943351985382/volumes"
Feb 24 03:14:43 crc kubenswrapper[4923]: I0224 03:14:43.728693 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb08edd9-6041-489a-8713-8bc00d88527c" path="/var/lib/kubelet/pods/cb08edd9-6041-489a-8713-8bc00d88527c/volumes"
Feb 24 03:14:44 crc kubenswrapper[4923]: I0224 03:14:44.290749 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3b54c615-8156-4ec6-aee7-b8c9448a574e","Type":"ContainerStarted","Data":"9470a8292b3ad9fc4e9364ae6b422c4deb5461fd47a78188abd660ff6a5bf768"}
Feb 24 03:14:44 crc kubenswrapper[4923]: I0224 03:14:44.295351 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerStarted","Data":"d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe"}
Feb 24 03:14:44 crc kubenswrapper[4923]: I0224 03:14:44.298656 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2656e2e0-085f-443d-ad1c-2243a4f92a11","Type":"ContainerStarted","Data":"e282bd48e6013d4b605cbe579e3f760af4dfcd3b16545989fbf0777d36a8bc09"}
Feb 24 03:14:44 crc kubenswrapper[4923]: I0224 03:14:44.298680 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2656e2e0-085f-443d-ad1c-2243a4f92a11","Type":"ContainerStarted","Data":"8f813f11c09f545f64e1a09f5afdacbae9449762c9694aa36229bd4649470559"}
Feb 24 03:14:45 crc kubenswrapper[4923]: I0224 03:14:45.308226 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerStarted","Data":"9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70"}
Feb 24 03:14:45 crc kubenswrapper[4923]: I0224 03:14:45.308675 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerStarted","Data":"4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692"}
Feb 24 03:14:45 crc kubenswrapper[4923]: I0224 03:14:45.311175 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2656e2e0-085f-443d-ad1c-2243a4f92a11","Type":"ContainerStarted","Data":"4b4bf187c255673bc989387a1d5f23be864429f7e60ce80045e2b1c67d3dea4f"}
Feb 24 03:14:45 crc kubenswrapper[4923]: I0224 03:14:45.335648 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.3356277 podStartE2EDuration="3.3356277s" podCreationTimestamp="2026-02-24 03:14:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:14:45.332312013 +0000 UTC m=+1209.349382826" watchObservedRunningTime="2026-02-24 03:14:45.3356277 +0000 UTC m=+1209.352698513"
Feb 24 03:14:45 crc kubenswrapper[4923]: I0224 03:14:45.339583 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.339568373 podStartE2EDuration="4.339568373s" podCreationTimestamp="2026-02-24 03:14:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:14:44.323067396 +0000 UTC m=+1208.340138239" watchObservedRunningTime="2026-02-24 03:14:45.339568373 +0000 UTC m=+1209.356639186"
Feb 24 03:14:47 crc kubenswrapper[4923]: I0224 03:14:47.332449 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerStarted","Data":"bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59"}
Feb 24 03:14:47 crc kubenswrapper[4923]: I0224 03:14:47.332769 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 24 03:14:47 crc kubenswrapper[4923]: I0224 03:14:47.368982 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.603368259 podStartE2EDuration="5.3689615s" podCreationTimestamp="2026-02-24 03:14:42 +0000 UTC" firstStartedPulling="2026-02-24 03:14:43.174034981 +0000 UTC m=+1207.191105794" lastFinishedPulling="2026-02-24 03:14:46.939628222 +0000 UTC m=+1210.956699035" observedRunningTime="2026-02-24 03:14:47.359443571 +0000 UTC m=+1211.376514394" watchObservedRunningTime="2026-02-24 03:14:47.3689615 +0000 UTC m=+1211.386032323"
Feb 24 03:14:51 crc kubenswrapper[4923]: I0224 03:14:51.388088 4923 generic.go:334] "Generic (PLEG): container finished" podID="a5bfccf6-bf49-4a31-9367-afe9f29cbf74" containerID="a6d81afc0f1ea5db3c8e12dd2b3757d1cf37f7e3b919a57b2128d42f99e88eba" exitCode=0
Feb 24 03:14:51 crc kubenswrapper[4923]: I0224 03:14:51.388134 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qlhvh" event={"ID":"a5bfccf6-bf49-4a31-9367-afe9f29cbf74","Type":"ContainerDied","Data":"a6d81afc0f1ea5db3c8e12dd2b3757d1cf37f7e3b919a57b2128d42f99e88eba"}
Feb 24 03:14:51 crc kubenswrapper[4923]: I0224 03:14:51.664087 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 24 03:14:51 crc kubenswrapper[4923]: I0224 03:14:51.664356 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 24 03:14:51 crc kubenswrapper[4923]: I0224 03:14:51.699808 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 24 03:14:51 crc kubenswrapper[4923]: I0224 03:14:51.708267 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.400601 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.401092 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.747000 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qlhvh"
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.898657 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-scripts\") pod \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") "
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.899044 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-combined-ca-bundle\") pod \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") "
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.899099 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-config-data\") pod \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") "
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.899137 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tp96\" (UniqueName: \"kubernetes.io/projected/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-kube-api-access-4tp96\") pod \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\" (UID: \"a5bfccf6-bf49-4a31-9367-afe9f29cbf74\") "
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.904523 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-scripts" (OuterVolumeSpecName: "scripts") pod "a5bfccf6-bf49-4a31-9367-afe9f29cbf74" (UID: "a5bfccf6-bf49-4a31-9367-afe9f29cbf74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.921061 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-kube-api-access-4tp96" (OuterVolumeSpecName: "kube-api-access-4tp96") pod "a5bfccf6-bf49-4a31-9367-afe9f29cbf74" (UID: "a5bfccf6-bf49-4a31-9367-afe9f29cbf74"). InnerVolumeSpecName "kube-api-access-4tp96". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.932265 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-config-data" (OuterVolumeSpecName: "config-data") pod "a5bfccf6-bf49-4a31-9367-afe9f29cbf74" (UID: "a5bfccf6-bf49-4a31-9367-afe9f29cbf74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.935057 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5bfccf6-bf49-4a31-9367-afe9f29cbf74" (UID: "a5bfccf6-bf49-4a31-9367-afe9f29cbf74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.976659 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:52 crc kubenswrapper[4923]: I0224 03:14:52.976703 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.000740 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.000776 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.000785 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-config-data\") on node \"crc\" DevicePath \"\""
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.000795 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tp96\" (UniqueName: \"kubernetes.io/projected/a5bfccf6-bf49-4a31-9367-afe9f29cbf74-kube-api-access-4tp96\") on node \"crc\" DevicePath \"\""
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.017099 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.026475 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.412836 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qlhvh" event={"ID":"a5bfccf6-bf49-4a31-9367-afe9f29cbf74","Type":"ContainerDied","Data":"9b2005a22651404f5e35fb2efcd4a507457f762849a80c7cca376e10f42d1a96"}
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.412916 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2005a22651404f5e35fb2efcd4a507457f762849a80c7cca376e10f42d1a96"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.412959 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qlhvh"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.414359 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.414407 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.544172 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 24 03:14:53 crc kubenswrapper[4923]: E0224 03:14:53.544725 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bfccf6-bf49-4a31-9367-afe9f29cbf74" containerName="nova-cell0-conductor-db-sync"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.544752 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bfccf6-bf49-4a31-9367-afe9f29cbf74" containerName="nova-cell0-conductor-db-sync"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.545005 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bfccf6-bf49-4a31-9367-afe9f29cbf74" containerName="nova-cell0-conductor-db-sync"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.546030 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.548380 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gg5bz"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.548569 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.558741 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.614282 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.614511 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8n56\" (UniqueName: \"kubernetes.io/projected/27299c6d-01f7-467b-a3b0-2456e190670e-kube-api-access-f8n56\") pod \"nova-cell0-conductor-0\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.614550 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.718014 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.718100 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8n56\" (UniqueName: \"kubernetes.io/projected/27299c6d-01f7-467b-a3b0-2456e190670e-kube-api-access-f8n56\") pod \"nova-cell0-conductor-0\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.718124 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.725408 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.733981 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.754161 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8n56\" (UniqueName: \"kubernetes.io/projected/27299c6d-01f7-467b-a3b0-2456e190670e-kube-api-access-f8n56\") pod \"nova-cell0-conductor-0\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:53 crc kubenswrapper[4923]: I0224 03:14:53.919788 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:54 crc kubenswrapper[4923]: I0224 03:14:54.397285 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 24 03:14:54 crc kubenswrapper[4923]: I0224 03:14:54.438919 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"27299c6d-01f7-467b-a3b0-2456e190670e","Type":"ContainerStarted","Data":"6b0b84588ec053b8975caee1b6cbe843ae2209ab63ed139e2e8557d8ae572405"}
Feb 24 03:14:54 crc kubenswrapper[4923]: I0224 03:14:54.498563 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 24 03:14:54 crc kubenswrapper[4923]: I0224 03:14:54.498650 4923 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 24 03:14:54 crc kubenswrapper[4923]: I0224 03:14:54.594283 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 24 03:14:55 crc kubenswrapper[4923]: I0224 03:14:55.448511 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"27299c6d-01f7-467b-a3b0-2456e190670e","Type":"ContainerStarted","Data":"d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084"}
Feb 24 03:14:55 crc kubenswrapper[4923]: I0224 03:14:55.449121 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 24 03:14:55 crc kubenswrapper[4923]: I0224 03:14:55.465941 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.4659200820000002 podStartE2EDuration="2.465920082s" podCreationTimestamp="2026-02-24 03:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:14:55.464452773 +0000 UTC m=+1219.481523586" watchObservedRunningTime="2026-02-24 03:14:55.465920082 +0000 UTC m=+1219.482990895"
Feb 24 03:14:55 crc kubenswrapper[4923]: I0224 03:14:55.539258 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:55 crc kubenswrapper[4923]: I0224 03:14:55.539360 4923 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 24 03:14:55 crc kubenswrapper[4923]: I0224 03:14:55.563755 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 24 03:14:56 crc kubenswrapper[4923]: I0224 03:14:56.342267 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 24 03:14:57 crc kubenswrapper[4923]: I0224 03:14:57.462589 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="27299c6d-01f7-467b-a3b0-2456e190670e" containerName="nova-cell0-conductor-conductor" containerID="cri-o://d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" gracePeriod=30
Feb 24 03:14:58 crc kubenswrapper[4923]: I0224 03:14:58.407153 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 24 03:14:58 crc kubenswrapper[4923]: I0224 03:14:58.407661 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="proxy-httpd" containerID="cri-o://bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59" gracePeriod=30
Feb 24 03:14:58 crc kubenswrapper[4923]: I0224 03:14:58.407773 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="sg-core" containerID="cri-o://9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70" gracePeriod=30
Feb 24 03:14:58 crc kubenswrapper[4923]: I0224 03:14:58.408451 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="ceilometer-central-agent" containerID="cri-o://d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe" gracePeriod=30
Feb 24 03:14:58 crc kubenswrapper[4923]: I0224 03:14:58.408574 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="ceilometer-notification-agent" containerID="cri-o://4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692" gracePeriod=30
Feb 24 03:14:58 crc kubenswrapper[4923]: I0224 03:14:58.413588 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 24 03:14:59 crc kubenswrapper[4923]: I0224 03:14:59.490949 4923 generic.go:334] "Generic (PLEG): container finished" podID="2c2bfcb9-4317-431a-a638-066d25df150a" containerID="bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59" exitCode=0
Feb 24 03:14:59 crc kubenswrapper[4923]: I0224 03:14:59.491215 4923 generic.go:334] "Generic (PLEG): container finished" podID="2c2bfcb9-4317-431a-a638-066d25df150a" containerID="9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70" exitCode=2
Feb 24 03:14:59 crc kubenswrapper[4923]: I0224 03:14:59.491225 4923 generic.go:334] "Generic (PLEG): container finished" podID="2c2bfcb9-4317-431a-a638-066d25df150a" containerID="d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe" exitCode=0
Feb 24 03:14:59 crc kubenswrapper[4923]: I0224 03:14:59.491042 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerDied","Data":"bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59"}
Feb 24 03:14:59 crc kubenswrapper[4923]: I0224 03:14:59.491277 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerDied","Data":"9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70"}
Feb 24 03:14:59 crc kubenswrapper[4923]: I0224 03:14:59.491343 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerDied","Data":"d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe"}
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.132503 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"]
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.133941 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.135994 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.136159 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.149713 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"]
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.263088 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8812730-dcd0-44d4-a795-256a1c1810e4-config-volume\") pod \"collect-profiles-29531715-r92t4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.263165 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8812730-dcd0-44d4-a795-256a1c1810e4-secret-volume\") pod \"collect-profiles-29531715-r92t4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.263210 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mrl\" (UniqueName: \"kubernetes.io/projected/e8812730-dcd0-44d4-a795-256a1c1810e4-kube-api-access-x5mrl\") pod \"collect-profiles-29531715-r92t4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.364425 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8812730-dcd0-44d4-a795-256a1c1810e4-secret-volume\") pod \"collect-profiles-29531715-r92t4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.364519 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mrl\" (UniqueName: \"kubernetes.io/projected/e8812730-dcd0-44d4-a795-256a1c1810e4-kube-api-access-x5mrl\") pod \"collect-profiles-29531715-r92t4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.364630 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8812730-dcd0-44d4-a795-256a1c1810e4-config-volume\") pod \"collect-profiles-29531715-r92t4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.365859 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8812730-dcd0-44d4-a795-256a1c1810e4-config-volume\") pod \"collect-profiles-29531715-r92t4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.370377 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8812730-dcd0-44d4-a795-256a1c1810e4-secret-volume\") pod \"collect-profiles-29531715-r92t4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.384408 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mrl\" (UniqueName: \"kubernetes.io/projected/e8812730-dcd0-44d4-a795-256a1c1810e4-kube-api-access-x5mrl\") pod \"collect-profiles-29531715-r92t4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:00 crc kubenswrapper[4923]: I0224 03:15:00.456246 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"
Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.038499 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"]
Feb 24 03:15:01 crc kubenswrapper[4923]: W0224 03:15:01.042079 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8812730_dcd0_44d4_a795_256a1c1810e4.slice/crio-bf526c6aa2f6f47c58e773f992a829185d1604f976c93c2c4736494ebb74639e WatchSource:0}: Error finding container bf526c6aa2f6f47c58e773f992a829185d1604f976c93c2c4736494ebb74639e: Status 404 returned error can't find the container with id bf526c6aa2f6f47c58e773f992a829185d1604f976c93c2c4736494ebb74639e
Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.058070 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.183648 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-combined-ca-bundle\") pod \"2c2bfcb9-4317-431a-a638-066d25df150a\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") "
Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.183711 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-run-httpd\") pod \"2c2bfcb9-4317-431a-a638-066d25df150a\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") "
Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.183735 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-log-httpd\") pod \"2c2bfcb9-4317-431a-a638-066d25df150a\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") "
Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.183773 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-config-data\") pod \"2c2bfcb9-4317-431a-a638-066d25df150a\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") "
Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.183873 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbvn4\" (UniqueName: \"kubernetes.io/projected/2c2bfcb9-4317-431a-a638-066d25df150a-kube-api-access-nbvn4\") pod \"2c2bfcb9-4317-431a-a638-066d25df150a\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") "
Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.183901 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName:
\"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-sg-core-conf-yaml\") pod \"2c2bfcb9-4317-431a-a638-066d25df150a\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.183981 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-scripts\") pod \"2c2bfcb9-4317-431a-a638-066d25df150a\" (UID: \"2c2bfcb9-4317-431a-a638-066d25df150a\") " Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.184064 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2c2bfcb9-4317-431a-a638-066d25df150a" (UID: "2c2bfcb9-4317-431a-a638-066d25df150a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.184205 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2c2bfcb9-4317-431a-a638-066d25df150a" (UID: "2c2bfcb9-4317-431a-a638-066d25df150a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.184496 4923 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.184509 4923 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c2bfcb9-4317-431a-a638-066d25df150a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.189671 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-scripts" (OuterVolumeSpecName: "scripts") pod "2c2bfcb9-4317-431a-a638-066d25df150a" (UID: "2c2bfcb9-4317-431a-a638-066d25df150a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.189777 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c2bfcb9-4317-431a-a638-066d25df150a-kube-api-access-nbvn4" (OuterVolumeSpecName: "kube-api-access-nbvn4") pod "2c2bfcb9-4317-431a-a638-066d25df150a" (UID: "2c2bfcb9-4317-431a-a638-066d25df150a"). InnerVolumeSpecName "kube-api-access-nbvn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.220008 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2c2bfcb9-4317-431a-a638-066d25df150a" (UID: "2c2bfcb9-4317-431a-a638-066d25df150a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.279730 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c2bfcb9-4317-431a-a638-066d25df150a" (UID: "2c2bfcb9-4317-431a-a638-066d25df150a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.286116 4923 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.286145 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.286154 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.286165 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbvn4\" (UniqueName: \"kubernetes.io/projected/2c2bfcb9-4317-431a-a638-066d25df150a-kube-api-access-nbvn4\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.289103 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-config-data" (OuterVolumeSpecName: "config-data") pod "2c2bfcb9-4317-431a-a638-066d25df150a" (UID: "2c2bfcb9-4317-431a-a638-066d25df150a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.388173 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c2bfcb9-4317-431a-a638-066d25df150a-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.516752 4923 generic.go:334] "Generic (PLEG): container finished" podID="2c2bfcb9-4317-431a-a638-066d25df150a" containerID="4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692" exitCode=0 Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.516835 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerDied","Data":"4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692"} Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.516866 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c2bfcb9-4317-431a-a638-066d25df150a","Type":"ContainerDied","Data":"60abcca2074c9f3e836e12a001abf0bccff22daff0a29d870e9d5bdfdcf22b88"} Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.516857 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.516886 4923 scope.go:117] "RemoveContainer" containerID="bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.521345 4923 generic.go:334] "Generic (PLEG): container finished" podID="e8812730-dcd0-44d4-a795-256a1c1810e4" containerID="0adaafb21f3da3758ef9730ab67e02aafcc6f9b5dae51df2c4c4fa12c15bf219" exitCode=0 Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.521384 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4" event={"ID":"e8812730-dcd0-44d4-a795-256a1c1810e4","Type":"ContainerDied","Data":"0adaafb21f3da3758ef9730ab67e02aafcc6f9b5dae51df2c4c4fa12c15bf219"} Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.521408 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4" event={"ID":"e8812730-dcd0-44d4-a795-256a1c1810e4","Type":"ContainerStarted","Data":"bf526c6aa2f6f47c58e773f992a829185d1604f976c93c2c4736494ebb74639e"} Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.548651 4923 scope.go:117] "RemoveContainer" containerID="9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.562431 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.570605 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.581724 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:01 crc kubenswrapper[4923]: E0224 03:15:01.582331 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" 
containerName="ceilometer-central-agent" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.582413 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="ceilometer-central-agent" Feb 24 03:15:01 crc kubenswrapper[4923]: E0224 03:15:01.582504 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="sg-core" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.582563 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="sg-core" Feb 24 03:15:01 crc kubenswrapper[4923]: E0224 03:15:01.582673 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="proxy-httpd" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.582724 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="proxy-httpd" Feb 24 03:15:01 crc kubenswrapper[4923]: E0224 03:15:01.582791 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="ceilometer-notification-agent" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.582839 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="ceilometer-notification-agent" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.583055 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="proxy-httpd" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.583120 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="ceilometer-central-agent" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.583179 4923 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="ceilometer-notification-agent" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.583241 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" containerName="sg-core" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.582834 4923 scope.go:117] "RemoveContainer" containerID="4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.585097 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.588474 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.588737 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.592765 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.617104 4923 scope.go:117] "RemoveContainer" containerID="d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.644881 4923 scope.go:117] "RemoveContainer" containerID="bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59" Feb 24 03:15:01 crc kubenswrapper[4923]: E0224 03:15:01.645986 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59\": container with ID starting with bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59 not found: ID does not exist" containerID="bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.646034 
4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59"} err="failed to get container status \"bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59\": rpc error: code = NotFound desc = could not find container \"bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59\": container with ID starting with bd8c464528befd76994ec42ea7c90a1cc5a65ea817fcc512c967eddcf4c2ce59 not found: ID does not exist" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.646112 4923 scope.go:117] "RemoveContainer" containerID="9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70" Feb 24 03:15:01 crc kubenswrapper[4923]: E0224 03:15:01.646692 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70\": container with ID starting with 9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70 not found: ID does not exist" containerID="9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.646738 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70"} err="failed to get container status \"9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70\": rpc error: code = NotFound desc = could not find container \"9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70\": container with ID starting with 9cd08b8b0f1ee989e679462e979faa89a78e9dac9df74abc1732c4c0d380eb70 not found: ID does not exist" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.646753 4923 scope.go:117] "RemoveContainer" containerID="4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692" Feb 24 03:15:01 crc kubenswrapper[4923]: E0224 
03:15:01.647239 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692\": container with ID starting with 4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692 not found: ID does not exist" containerID="4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.647294 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692"} err="failed to get container status \"4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692\": rpc error: code = NotFound desc = could not find container \"4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692\": container with ID starting with 4060a5d39d30839ad38d669aab03272688b325a6bc8499c474834c4511af5692 not found: ID does not exist" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.647342 4923 scope.go:117] "RemoveContainer" containerID="d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe" Feb 24 03:15:01 crc kubenswrapper[4923]: E0224 03:15:01.647674 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe\": container with ID starting with d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe not found: ID does not exist" containerID="d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.647708 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe"} err="failed to get container status \"d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe\": rpc 
error: code = NotFound desc = could not find container \"d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe\": container with ID starting with d542a6a62c162013c247e97826918eca9f0f6fcbe39e3ba52ae0217a758ec9fe not found: ID does not exist" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.692841 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-scripts\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.692904 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-log-httpd\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.692927 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.692957 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm8mz\" (UniqueName: \"kubernetes.io/projected/0e64699d-3b40-48c4-822f-f7d6e324aa55-kube-api-access-sm8mz\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.692977 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-run-httpd\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.693008 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.693049 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-config-data\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.725095 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c2bfcb9-4317-431a-a638-066d25df150a" path="/var/lib/kubelet/pods/2c2bfcb9-4317-431a-a638-066d25df150a/volumes" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.794340 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-config-data\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.794643 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-scripts\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.794893 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-log-httpd\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.795288 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.795804 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm8mz\" (UniqueName: \"kubernetes.io/projected/0e64699d-3b40-48c4-822f-f7d6e324aa55-kube-api-access-sm8mz\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.795456 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-log-httpd\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.795849 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-run-httpd\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.795923 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 
03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.796342 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-run-httpd\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.798896 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-scripts\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.799014 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.809048 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.809224 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-config-data\") pod \"ceilometer-0\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.816059 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm8mz\" (UniqueName: \"kubernetes.io/projected/0e64699d-3b40-48c4-822f-f7d6e324aa55-kube-api-access-sm8mz\") pod \"ceilometer-0\" (UID: 
\"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.913572 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.988221 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 24 03:15:01 crc kubenswrapper[4923]: I0224 03:15:01.988517 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a6bbccab-2bc6-47d9-b1d1-896c661112d8" containerName="kube-state-metrics" containerID="cri-o://201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc" gracePeriod=30 Feb 24 03:15:02 crc kubenswrapper[4923]: W0224 03:15:02.412689 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e64699d_3b40_48c4_822f_f7d6e324aa55.slice/crio-00d954a3f6733cf5a20996942a8efe2349aadff6cbb3422eaf74905006f63518 WatchSource:0}: Error finding container 00d954a3f6733cf5a20996942a8efe2349aadff6cbb3422eaf74905006f63518: Status 404 returned error can't find the container with id 00d954a3f6733cf5a20996942a8efe2349aadff6cbb3422eaf74905006f63518 Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.428395 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.525500 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.533457 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerStarted","Data":"00d954a3f6733cf5a20996942a8efe2349aadff6cbb3422eaf74905006f63518"} Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.540574 4923 generic.go:334] "Generic (PLEG): container finished" podID="a6bbccab-2bc6-47d9-b1d1-896c661112d8" containerID="201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc" exitCode=2 Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.540752 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.540919 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6bbccab-2bc6-47d9-b1d1-896c661112d8","Type":"ContainerDied","Data":"201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc"} Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.540982 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6bbccab-2bc6-47d9-b1d1-896c661112d8","Type":"ContainerDied","Data":"932c87fcbd737bd40fd9ed2d63d22ddf51e87c53a9ecd3694c4b576936aa9708"} Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.541003 4923 scope.go:117] "RemoveContainer" containerID="201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.587518 4923 scope.go:117] "RemoveContainer" containerID="201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc" Feb 24 03:15:02 crc kubenswrapper[4923]: E0224 03:15:02.591304 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc\": container with ID starting with 201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc not found: ID does not exist" containerID="201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.591403 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc"} err="failed to get container status \"201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc\": rpc error: code = NotFound desc = could not find container \"201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc\": container with ID starting with 201e4cb7602236ffd3ccf5096bfce73551432c5c938d6f55efdf7aea08dd3dfc not found: ID does not exist" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.618726 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f2lz\" (UniqueName: \"kubernetes.io/projected/a6bbccab-2bc6-47d9-b1d1-896c661112d8-kube-api-access-5f2lz\") pod \"a6bbccab-2bc6-47d9-b1d1-896c661112d8\" (UID: \"a6bbccab-2bc6-47d9-b1d1-896c661112d8\") " Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.628125 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bbccab-2bc6-47d9-b1d1-896c661112d8-kube-api-access-5f2lz" (OuterVolumeSpecName: "kube-api-access-5f2lz") pod "a6bbccab-2bc6-47d9-b1d1-896c661112d8" (UID: "a6bbccab-2bc6-47d9-b1d1-896c661112d8"). InnerVolumeSpecName "kube-api-access-5f2lz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.722223 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f2lz\" (UniqueName: \"kubernetes.io/projected/a6bbccab-2bc6-47d9-b1d1-896c661112d8-kube-api-access-5f2lz\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.883405 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.892579 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.897181 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.917418 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 24 03:15:02 crc kubenswrapper[4923]: E0224 03:15:02.918003 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8812730-dcd0-44d4-a795-256a1c1810e4" containerName="collect-profiles" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.918067 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8812730-dcd0-44d4-a795-256a1c1810e4" containerName="collect-profiles" Feb 24 03:15:02 crc kubenswrapper[4923]: E0224 03:15:02.918122 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bbccab-2bc6-47d9-b1d1-896c661112d8" containerName="kube-state-metrics" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.918181 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bbccab-2bc6-47d9-b1d1-896c661112d8" containerName="kube-state-metrics" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.918425 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8812730-dcd0-44d4-a795-256a1c1810e4" 
containerName="collect-profiles" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.918506 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6bbccab-2bc6-47d9-b1d1-896c661112d8" containerName="kube-state-metrics" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.919127 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.922127 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.922385 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 24 03:15:02 crc kubenswrapper[4923]: I0224 03:15:02.927833 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.026178 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8812730-dcd0-44d4-a795-256a1c1810e4-secret-volume\") pod \"e8812730-dcd0-44d4-a795-256a1c1810e4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.026261 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5mrl\" (UniqueName: \"kubernetes.io/projected/e8812730-dcd0-44d4-a795-256a1c1810e4-kube-api-access-x5mrl\") pod \"e8812730-dcd0-44d4-a795-256a1c1810e4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.026308 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8812730-dcd0-44d4-a795-256a1c1810e4-config-volume\") pod \"e8812730-dcd0-44d4-a795-256a1c1810e4\" (UID: \"e8812730-dcd0-44d4-a795-256a1c1810e4\") " Feb 24 
03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.026712 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18af060f-9e29-435c-82a9-6bdd59867a46-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.026747 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/18af060f-9e29-435c-82a9-6bdd59867a46-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.026772 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8nzv\" (UniqueName: \"kubernetes.io/projected/18af060f-9e29-435c-82a9-6bdd59867a46-kube-api-access-v8nzv\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.026807 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/18af060f-9e29-435c-82a9-6bdd59867a46-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.027423 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8812730-dcd0-44d4-a795-256a1c1810e4-config-volume" (OuterVolumeSpecName: "config-volume") pod "e8812730-dcd0-44d4-a795-256a1c1810e4" (UID: "e8812730-dcd0-44d4-a795-256a1c1810e4"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.030725 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8812730-dcd0-44d4-a795-256a1c1810e4-kube-api-access-x5mrl" (OuterVolumeSpecName: "kube-api-access-x5mrl") pod "e8812730-dcd0-44d4-a795-256a1c1810e4" (UID: "e8812730-dcd0-44d4-a795-256a1c1810e4"). InnerVolumeSpecName "kube-api-access-x5mrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.031127 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8812730-dcd0-44d4-a795-256a1c1810e4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e8812730-dcd0-44d4-a795-256a1c1810e4" (UID: "e8812730-dcd0-44d4-a795-256a1c1810e4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.128263 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18af060f-9e29-435c-82a9-6bdd59867a46-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.128442 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/18af060f-9e29-435c-82a9-6bdd59867a46-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.128558 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8nzv\" (UniqueName: 
\"kubernetes.io/projected/18af060f-9e29-435c-82a9-6bdd59867a46-kube-api-access-v8nzv\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.128684 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/18af060f-9e29-435c-82a9-6bdd59867a46-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.129240 4923 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8812730-dcd0-44d4-a795-256a1c1810e4-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.129372 4923 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8812730-dcd0-44d4-a795-256a1c1810e4-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.129469 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5mrl\" (UniqueName: \"kubernetes.io/projected/e8812730-dcd0-44d4-a795-256a1c1810e4-kube-api-access-x5mrl\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.133451 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/18af060f-9e29-435c-82a9-6bdd59867a46-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.133550 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/18af060f-9e29-435c-82a9-6bdd59867a46-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.134617 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18af060f-9e29-435c-82a9-6bdd59867a46-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.147048 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8nzv\" (UniqueName: \"kubernetes.io/projected/18af060f-9e29-435c-82a9-6bdd59867a46-kube-api-access-v8nzv\") pod \"kube-state-metrics-0\" (UID: \"18af060f-9e29-435c-82a9-6bdd59867a46\") " pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.241205 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.509662 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.561723 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"18af060f-9e29-435c-82a9-6bdd59867a46","Type":"ContainerStarted","Data":"ba10fad30de800d25dd4cc4399f0dccc14b42193f6b31607972a88f0c065d4a9"} Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.564851 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.565433 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4" event={"ID":"e8812730-dcd0-44d4-a795-256a1c1810e4","Type":"ContainerDied","Data":"bf526c6aa2f6f47c58e773f992a829185d1604f976c93c2c4736494ebb74639e"} Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.566012 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf526c6aa2f6f47c58e773f992a829185d1604f976c93c2c4736494ebb74639e" Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.567103 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerStarted","Data":"0c5c44843208dafdbbdb9b48fc0673af79ea94722b20351d71b96e17764bebb9"} Feb 24 03:15:03 crc kubenswrapper[4923]: I0224 03:15:03.727209 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6bbccab-2bc6-47d9-b1d1-896c661112d8" path="/var/lib/kubelet/pods/a6bbccab-2bc6-47d9-b1d1-896c661112d8/volumes" Feb 24 03:15:03 crc kubenswrapper[4923]: E0224 03:15:03.928001 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:03 crc kubenswrapper[4923]: E0224 03:15:03.931181 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 
03:15:03 crc kubenswrapper[4923]: E0224 03:15:03.938936 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:03 crc kubenswrapper[4923]: E0224 03:15:03.939005 4923 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="27299c6d-01f7-467b-a3b0-2456e190670e" containerName="nova-cell0-conductor-conductor" Feb 24 03:15:04 crc kubenswrapper[4923]: I0224 03:15:04.297618 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:04 crc kubenswrapper[4923]: I0224 03:15:04.577846 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerStarted","Data":"d0f3890476f78a56be7c5feb3ca802b6427f2169e6305913f4be7cbb68a110e5"} Feb 24 03:15:04 crc kubenswrapper[4923]: I0224 03:15:04.577893 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerStarted","Data":"99747dd440ab2cabbf776aeb2bad23e8790933282c641c069cc39b58bb34ad28"} Feb 24 03:15:04 crc kubenswrapper[4923]: I0224 03:15:04.580309 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"18af060f-9e29-435c-82a9-6bdd59867a46","Type":"ContainerStarted","Data":"68e844e396a5f65fc1017ab5769fe156899c1b04706a563359904a501b6398a2"} Feb 24 03:15:04 crc kubenswrapper[4923]: I0224 03:15:04.580476 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" 
Feb 24 03:15:07 crc kubenswrapper[4923]: I0224 03:15:07.617822 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerStarted","Data":"9b001c67029ea8dfdc87d8a006551f3ee5e4444050c02b591ed92cb4ffeaa24e"} Feb 24 03:15:07 crc kubenswrapper[4923]: I0224 03:15:07.618462 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 24 03:15:07 crc kubenswrapper[4923]: I0224 03:15:07.618105 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="ceilometer-notification-agent" containerID="cri-o://99747dd440ab2cabbf776aeb2bad23e8790933282c641c069cc39b58bb34ad28" gracePeriod=30 Feb 24 03:15:07 crc kubenswrapper[4923]: I0224 03:15:07.618012 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="ceilometer-central-agent" containerID="cri-o://0c5c44843208dafdbbdb9b48fc0673af79ea94722b20351d71b96e17764bebb9" gracePeriod=30 Feb 24 03:15:07 crc kubenswrapper[4923]: I0224 03:15:07.618115 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="sg-core" containerID="cri-o://d0f3890476f78a56be7c5feb3ca802b6427f2169e6305913f4be7cbb68a110e5" gracePeriod=30 Feb 24 03:15:07 crc kubenswrapper[4923]: I0224 03:15:07.618150 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="proxy-httpd" containerID="cri-o://9b001c67029ea8dfdc87d8a006551f3ee5e4444050c02b591ed92cb4ffeaa24e" gracePeriod=30 Feb 24 03:15:07 crc kubenswrapper[4923]: I0224 03:15:07.648554 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.259918742 podStartE2EDuration="6.648530427s" podCreationTimestamp="2026-02-24 03:15:01 +0000 UTC" firstStartedPulling="2026-02-24 03:15:02.416105356 +0000 UTC m=+1226.433176169" lastFinishedPulling="2026-02-24 03:15:06.804717031 +0000 UTC m=+1230.821787854" observedRunningTime="2026-02-24 03:15:07.645400367 +0000 UTC m=+1231.662471180" watchObservedRunningTime="2026-02-24 03:15:07.648530427 +0000 UTC m=+1231.665601260" Feb 24 03:15:07 crc kubenswrapper[4923]: I0224 03:15:07.658201 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=5.119625657 podStartE2EDuration="5.658184816s" podCreationTimestamp="2026-02-24 03:15:02 +0000 UTC" firstStartedPulling="2026-02-24 03:15:03.524632805 +0000 UTC m=+1227.541703608" lastFinishedPulling="2026-02-24 03:15:04.063191954 +0000 UTC m=+1228.080262767" observedRunningTime="2026-02-24 03:15:04.602125232 +0000 UTC m=+1228.619196045" watchObservedRunningTime="2026-02-24 03:15:07.658184816 +0000 UTC m=+1231.675255629" Feb 24 03:15:08 crc kubenswrapper[4923]: I0224 03:15:08.628120 4923 generic.go:334] "Generic (PLEG): container finished" podID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerID="9b001c67029ea8dfdc87d8a006551f3ee5e4444050c02b591ed92cb4ffeaa24e" exitCode=0 Feb 24 03:15:08 crc kubenswrapper[4923]: I0224 03:15:08.628403 4923 generic.go:334] "Generic (PLEG): container finished" podID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerID="d0f3890476f78a56be7c5feb3ca802b6427f2169e6305913f4be7cbb68a110e5" exitCode=2 Feb 24 03:15:08 crc kubenswrapper[4923]: I0224 03:15:08.628412 4923 generic.go:334] "Generic (PLEG): container finished" podID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerID="99747dd440ab2cabbf776aeb2bad23e8790933282c641c069cc39b58bb34ad28" exitCode=0 Feb 24 03:15:08 crc kubenswrapper[4923]: I0224 03:15:08.628207 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerDied","Data":"9b001c67029ea8dfdc87d8a006551f3ee5e4444050c02b591ed92cb4ffeaa24e"} Feb 24 03:15:08 crc kubenswrapper[4923]: I0224 03:15:08.628441 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerDied","Data":"d0f3890476f78a56be7c5feb3ca802b6427f2169e6305913f4be7cbb68a110e5"} Feb 24 03:15:08 crc kubenswrapper[4923]: I0224 03:15:08.628452 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerDied","Data":"99747dd440ab2cabbf776aeb2bad23e8790933282c641c069cc39b58bb34ad28"} Feb 24 03:15:08 crc kubenswrapper[4923]: E0224 03:15:08.923583 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:08 crc kubenswrapper[4923]: E0224 03:15:08.925525 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:08 crc kubenswrapper[4923]: E0224 03:15:08.926809 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:08 crc kubenswrapper[4923]: E0224 
03:15:08.926885 4923 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="27299c6d-01f7-467b-a3b0-2456e190670e" containerName="nova-cell0-conductor-conductor" Feb 24 03:15:10 crc kubenswrapper[4923]: I0224 03:15:10.648663 4923 generic.go:334] "Generic (PLEG): container finished" podID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerID="0c5c44843208dafdbbdb9b48fc0673af79ea94722b20351d71b96e17764bebb9" exitCode=0 Feb 24 03:15:10 crc kubenswrapper[4923]: I0224 03:15:10.648863 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerDied","Data":"0c5c44843208dafdbbdb9b48fc0673af79ea94722b20351d71b96e17764bebb9"} Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.001796 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.123547 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm8mz\" (UniqueName: \"kubernetes.io/projected/0e64699d-3b40-48c4-822f-f7d6e324aa55-kube-api-access-sm8mz\") pod \"0e64699d-3b40-48c4-822f-f7d6e324aa55\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.123628 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-log-httpd\") pod \"0e64699d-3b40-48c4-822f-f7d6e324aa55\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.123667 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-combined-ca-bundle\") pod \"0e64699d-3b40-48c4-822f-f7d6e324aa55\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.123707 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-config-data\") pod \"0e64699d-3b40-48c4-822f-f7d6e324aa55\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.123736 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-run-httpd\") pod \"0e64699d-3b40-48c4-822f-f7d6e324aa55\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.123762 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-sg-core-conf-yaml\") pod \"0e64699d-3b40-48c4-822f-f7d6e324aa55\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.123783 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-scripts\") pod \"0e64699d-3b40-48c4-822f-f7d6e324aa55\" (UID: \"0e64699d-3b40-48c4-822f-f7d6e324aa55\") " Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.124208 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0e64699d-3b40-48c4-822f-f7d6e324aa55" (UID: "0e64699d-3b40-48c4-822f-f7d6e324aa55"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.124268 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0e64699d-3b40-48c4-822f-f7d6e324aa55" (UID: "0e64699d-3b40-48c4-822f-f7d6e324aa55"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.124392 4923 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.124409 4923 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e64699d-3b40-48c4-822f-f7d6e324aa55-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.129632 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-scripts" (OuterVolumeSpecName: "scripts") pod "0e64699d-3b40-48c4-822f-f7d6e324aa55" (UID: "0e64699d-3b40-48c4-822f-f7d6e324aa55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.131055 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e64699d-3b40-48c4-822f-f7d6e324aa55-kube-api-access-sm8mz" (OuterVolumeSpecName: "kube-api-access-sm8mz") pod "0e64699d-3b40-48c4-822f-f7d6e324aa55" (UID: "0e64699d-3b40-48c4-822f-f7d6e324aa55"). InnerVolumeSpecName "kube-api-access-sm8mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.189539 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0e64699d-3b40-48c4-822f-f7d6e324aa55" (UID: "0e64699d-3b40-48c4-822f-f7d6e324aa55"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.198998 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e64699d-3b40-48c4-822f-f7d6e324aa55" (UID: "0e64699d-3b40-48c4-822f-f7d6e324aa55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.222356 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-config-data" (OuterVolumeSpecName: "config-data") pod "0e64699d-3b40-48c4-822f-f7d6e324aa55" (UID: "0e64699d-3b40-48c4-822f-f7d6e324aa55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.226403 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm8mz\" (UniqueName: \"kubernetes.io/projected/0e64699d-3b40-48c4-822f-f7d6e324aa55-kube-api-access-sm8mz\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.226431 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.226441 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.226449 4923 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-sg-core-conf-yaml\") on node \"crc\" 
DevicePath \"\"" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.226458 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e64699d-3b40-48c4-822f-f7d6e324aa55-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.660726 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e64699d-3b40-48c4-822f-f7d6e324aa55","Type":"ContainerDied","Data":"00d954a3f6733cf5a20996942a8efe2349aadff6cbb3422eaf74905006f63518"} Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.660780 4923 scope.go:117] "RemoveContainer" containerID="9b001c67029ea8dfdc87d8a006551f3ee5e4444050c02b591ed92cb4ffeaa24e" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.660799 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.739704 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.740042 4923 scope.go:117] "RemoveContainer" containerID="d0f3890476f78a56be7c5feb3ca802b6427f2169e6305913f4be7cbb68a110e5" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.741457 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.760823 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:11 crc kubenswrapper[4923]: E0224 03:15:11.761379 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="ceilometer-notification-agent" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.761401 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="ceilometer-notification-agent" Feb 24 03:15:11 crc 
kubenswrapper[4923]: E0224 03:15:11.761439 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="sg-core" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.761449 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="sg-core" Feb 24 03:15:11 crc kubenswrapper[4923]: E0224 03:15:11.761470 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="proxy-httpd" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.761481 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="proxy-httpd" Feb 24 03:15:11 crc kubenswrapper[4923]: E0224 03:15:11.761496 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="ceilometer-central-agent" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.761504 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="ceilometer-central-agent" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.761734 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="sg-core" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.761762 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="ceilometer-notification-agent" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.761784 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="ceilometer-central-agent" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.761797 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" containerName="proxy-httpd" Feb 24 
03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.764041 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.765747 4923 scope.go:117] "RemoveContainer" containerID="99747dd440ab2cabbf776aeb2bad23e8790933282c641c069cc39b58bb34ad28" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.771125 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.771589 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.771812 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.771829 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.800577 4923 scope.go:117] "RemoveContainer" containerID="0c5c44843208dafdbbdb9b48fc0673af79ea94722b20351d71b96e17764bebb9" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.938353 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-run-httpd\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.938778 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-scripts\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.938863 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-log-httpd\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.938933 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-config-data\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.938966 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nspm\" (UniqueName: \"kubernetes.io/projected/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-kube-api-access-4nspm\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.939021 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.939093 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:11 crc kubenswrapper[4923]: I0224 03:15:11.939174 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.041069 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.041136 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.041163 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.041258 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-run-httpd\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.041290 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-scripts\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: 
I0224 03:15:12.041342 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-log-httpd\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.041378 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-config-data\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.041409 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nspm\" (UniqueName: \"kubernetes.io/projected/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-kube-api-access-4nspm\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.042925 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-log-httpd\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.043133 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-run-httpd\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.045802 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.048864 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.051258 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-config-data\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.054438 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-scripts\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.062019 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nspm\" (UniqueName: \"kubernetes.io/projected/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-kube-api-access-4nspm\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.070143 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.090193 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.592879 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:15:12 crc kubenswrapper[4923]: I0224 03:15:12.673572 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerStarted","Data":"01c44f6622c885c0d0829cb871c133986f8ee3b7d834a544b6500ac29868c7eb"} Feb 24 03:15:13 crc kubenswrapper[4923]: I0224 03:15:13.250110 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 24 03:15:13 crc kubenswrapper[4923]: I0224 03:15:13.684738 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerStarted","Data":"33c265621497e15b63b49367e1e4f35deab97b0b1060f46f24f73796bda2e128"} Feb 24 03:15:13 crc kubenswrapper[4923]: I0224 03:15:13.726357 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e64699d-3b40-48c4-822f-f7d6e324aa55" path="/var/lib/kubelet/pods/0e64699d-3b40-48c4-822f-f7d6e324aa55/volumes" Feb 24 03:15:13 crc kubenswrapper[4923]: E0224 03:15:13.924840 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:13 crc kubenswrapper[4923]: E0224 03:15:13.926439 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:13 crc kubenswrapper[4923]: E0224 03:15:13.937821 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:13 crc kubenswrapper[4923]: E0224 03:15:13.937882 4923 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="27299c6d-01f7-467b-a3b0-2456e190670e" containerName="nova-cell0-conductor-conductor" Feb 24 03:15:14 crc kubenswrapper[4923]: I0224 03:15:14.693074 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerStarted","Data":"07c7c699b11ffa84fc823982a34336df0a9616748438f269f4aa71442da27352"} Feb 24 03:15:14 crc kubenswrapper[4923]: I0224 03:15:14.693313 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerStarted","Data":"1186ac07f85c636118a322753084290e3301c3e363a6f4b7ef454e061037f398"} Feb 24 03:15:16 crc kubenswrapper[4923]: I0224 03:15:16.740135 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerStarted","Data":"91d81246b73c0225957b1bd2d65d5fe5228e087a38e9f21a5abb232973036ef8"} Feb 24 03:15:16 crc kubenswrapper[4923]: I0224 03:15:16.740819 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 24 03:15:16 crc kubenswrapper[4923]: I0224 03:15:16.773150 4923 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.907454194 podStartE2EDuration="5.77312995s" podCreationTimestamp="2026-02-24 03:15:11 +0000 UTC" firstStartedPulling="2026-02-24 03:15:12.610233127 +0000 UTC m=+1236.627303940" lastFinishedPulling="2026-02-24 03:15:16.475908883 +0000 UTC m=+1240.492979696" observedRunningTime="2026-02-24 03:15:16.767998318 +0000 UTC m=+1240.785069141" watchObservedRunningTime="2026-02-24 03:15:16.77312995 +0000 UTC m=+1240.790200783" Feb 24 03:15:18 crc kubenswrapper[4923]: E0224 03:15:18.923914 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:18 crc kubenswrapper[4923]: E0224 03:15:18.925911 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:18 crc kubenswrapper[4923]: E0224 03:15:18.927140 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:18 crc kubenswrapper[4923]: E0224 03:15:18.927187 4923 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" 
podUID="27299c6d-01f7-467b-a3b0-2456e190670e" containerName="nova-cell0-conductor-conductor" Feb 24 03:15:23 crc kubenswrapper[4923]: E0224 03:15:23.923983 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:23 crc kubenswrapper[4923]: E0224 03:15:23.928698 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:23 crc kubenswrapper[4923]: E0224 03:15:23.930022 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 24 03:15:23 crc kubenswrapper[4923]: E0224 03:15:23.930084 4923 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="27299c6d-01f7-467b-a3b0-2456e190670e" containerName="nova-cell0-conductor-conductor" Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.838458 4923 generic.go:334] "Generic (PLEG): container finished" podID="27299c6d-01f7-467b-a3b0-2456e190670e" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" exitCode=137 Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.839038 4923 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"27299c6d-01f7-467b-a3b0-2456e190670e","Type":"ContainerDied","Data":"d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084"} Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.839068 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"27299c6d-01f7-467b-a3b0-2456e190670e","Type":"ContainerDied","Data":"6b0b84588ec053b8975caee1b6cbe843ae2209ab63ed139e2e8557d8ae572405"} Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.839084 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b0b84588ec053b8975caee1b6cbe843ae2209ab63ed139e2e8557d8ae572405" Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.870158 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.945841 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8n56\" (UniqueName: \"kubernetes.io/projected/27299c6d-01f7-467b-a3b0-2456e190670e-kube-api-access-f8n56\") pod \"27299c6d-01f7-467b-a3b0-2456e190670e\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.945950 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-combined-ca-bundle\") pod \"27299c6d-01f7-467b-a3b0-2456e190670e\" (UID: \"27299c6d-01f7-467b-a3b0-2456e190670e\") " Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.946114 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-config-data\") pod \"27299c6d-01f7-467b-a3b0-2456e190670e\" (UID: 
\"27299c6d-01f7-467b-a3b0-2456e190670e\") " Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.951798 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27299c6d-01f7-467b-a3b0-2456e190670e-kube-api-access-f8n56" (OuterVolumeSpecName: "kube-api-access-f8n56") pod "27299c6d-01f7-467b-a3b0-2456e190670e" (UID: "27299c6d-01f7-467b-a3b0-2456e190670e"). InnerVolumeSpecName "kube-api-access-f8n56". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.976185 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-config-data" (OuterVolumeSpecName: "config-data") pod "27299c6d-01f7-467b-a3b0-2456e190670e" (UID: "27299c6d-01f7-467b-a3b0-2456e190670e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:27 crc kubenswrapper[4923]: I0224 03:15:27.976781 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27299c6d-01f7-467b-a3b0-2456e190670e" (UID: "27299c6d-01f7-467b-a3b0-2456e190670e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.048241 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.048319 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27299c6d-01f7-467b-a3b0-2456e190670e-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.048329 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8n56\" (UniqueName: \"kubernetes.io/projected/27299c6d-01f7-467b-a3b0-2456e190670e-kube-api-access-f8n56\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.848917 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.912778 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.932416 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.952976 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 24 03:15:28 crc kubenswrapper[4923]: E0224 03:15:28.953380 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27299c6d-01f7-467b-a3b0-2456e190670e" containerName="nova-cell0-conductor-conductor" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.953423 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="27299c6d-01f7-467b-a3b0-2456e190670e" containerName="nova-cell0-conductor-conductor" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 
03:15:28.953645 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="27299c6d-01f7-467b-a3b0-2456e190670e" containerName="nova-cell0-conductor-conductor" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.954235 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.956003 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.957481 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gg5bz" Feb 24 03:15:28 crc kubenswrapper[4923]: I0224 03:15:28.962659 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.075854 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mws2f\" (UniqueName: \"kubernetes.io/projected/5c079676-c0fa-49dd-94fe-360388e5014d-kube-api-access-mws2f\") pod \"nova-cell0-conductor-0\" (UID: \"5c079676-c0fa-49dd-94fe-360388e5014d\") " pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.076289 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c079676-c0fa-49dd-94fe-360388e5014d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5c079676-c0fa-49dd-94fe-360388e5014d\") " pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.076415 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c079676-c0fa-49dd-94fe-360388e5014d-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"5c079676-c0fa-49dd-94fe-360388e5014d\") " pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.178618 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mws2f\" (UniqueName: \"kubernetes.io/projected/5c079676-c0fa-49dd-94fe-360388e5014d-kube-api-access-mws2f\") pod \"nova-cell0-conductor-0\" (UID: \"5c079676-c0fa-49dd-94fe-360388e5014d\") " pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.178697 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c079676-c0fa-49dd-94fe-360388e5014d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5c079676-c0fa-49dd-94fe-360388e5014d\") " pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.178868 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c079676-c0fa-49dd-94fe-360388e5014d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5c079676-c0fa-49dd-94fe-360388e5014d\") " pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.188106 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c079676-c0fa-49dd-94fe-360388e5014d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5c079676-c0fa-49dd-94fe-360388e5014d\") " pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.193270 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c079676-c0fa-49dd-94fe-360388e5014d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5c079676-c0fa-49dd-94fe-360388e5014d\") " pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: 
I0224 03:15:29.199970 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mws2f\" (UniqueName: \"kubernetes.io/projected/5c079676-c0fa-49dd-94fe-360388e5014d-kube-api-access-mws2f\") pod \"nova-cell0-conductor-0\" (UID: \"5c079676-c0fa-49dd-94fe-360388e5014d\") " pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.275378 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.726570 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27299c6d-01f7-467b-a3b0-2456e190670e" path="/var/lib/kubelet/pods/27299c6d-01f7-467b-a3b0-2456e190670e/volumes" Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.759467 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 24 03:15:29 crc kubenswrapper[4923]: W0224 03:15:29.765240 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c079676_c0fa_49dd_94fe_360388e5014d.slice/crio-96b4266f998d49fe4fb631cfc307f43885f88899b13c0fbaf444f6b0e7dc7c11 WatchSource:0}: Error finding container 96b4266f998d49fe4fb631cfc307f43885f88899b13c0fbaf444f6b0e7dc7c11: Status 404 returned error can't find the container with id 96b4266f998d49fe4fb631cfc307f43885f88899b13c0fbaf444f6b0e7dc7c11 Feb 24 03:15:29 crc kubenswrapper[4923]: I0224 03:15:29.857712 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5c079676-c0fa-49dd-94fe-360388e5014d","Type":"ContainerStarted","Data":"96b4266f998d49fe4fb631cfc307f43885f88899b13c0fbaf444f6b0e7dc7c11"} Feb 24 03:15:30 crc kubenswrapper[4923]: I0224 03:15:30.869787 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"5c079676-c0fa-49dd-94fe-360388e5014d","Type":"ContainerStarted","Data":"18be73183ee364982ab06632b853fb0084abc468946b6a81ff258e4a176b970b"} Feb 24 03:15:30 crc kubenswrapper[4923]: I0224 03:15:30.870292 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:30 crc kubenswrapper[4923]: I0224 03:15:30.896006 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.895979534 podStartE2EDuration="2.895979534s" podCreationTimestamp="2026-02-24 03:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:15:30.89039643 +0000 UTC m=+1254.907467263" watchObservedRunningTime="2026-02-24 03:15:30.895979534 +0000 UTC m=+1254.913050347" Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.316043 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.840310 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-clrmb"] Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.841652 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.848011 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.848630 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.851257 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-clrmb"] Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.998569 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-scripts\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.998617 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-config-data\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.998654 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.998799 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w4kz\" (UniqueName: 
\"kubernetes.io/projected/557d4e5b-4b4a-4eee-b199-533822f52b8f-kube-api-access-2w4kz\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:34 crc kubenswrapper[4923]: I0224 03:15:34.999364 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.000421 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.002677 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.016061 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.085205 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.086596 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.093244 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.101144 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-scripts\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.101196 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-config-data\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.101232 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.101256 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w4kz\" (UniqueName: \"kubernetes.io/projected/557d4e5b-4b4a-4eee-b199-533822f52b8f-kube-api-access-2w4kz\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.101380 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.101413 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.101449 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn849\" (UniqueName: \"kubernetes.io/projected/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-kube-api-access-jn849\") pod \"nova-cell1-novncproxy-0\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.106815 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-scripts\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.112651 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.115470 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-config-data\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.129358 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.155843 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w4kz\" (UniqueName: \"kubernetes.io/projected/557d4e5b-4b4a-4eee-b199-533822f52b8f-kube-api-access-2w4kz\") pod \"nova-cell0-cell-mapping-clrmb\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.169769 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.203885 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.203945 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-config-data\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.203993 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzml\" (UniqueName: \"kubernetes.io/projected/cc908a6f-cd0f-401c-a719-4663d10bae28-kube-api-access-8vzml\") pod 
\"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.204024 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.204060 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.204109 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn849\" (UniqueName: \"kubernetes.io/projected/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-kube-api-access-jn849\") pod \"nova-cell1-novncproxy-0\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.204151 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc908a6f-cd0f-401c-a719-4663d10bae28-logs\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.212998 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc 
kubenswrapper[4923]: I0224 03:15:35.216547 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.246738 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.248370 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.248804 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn849\" (UniqueName: \"kubernetes.io/projected/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-kube-api-access-jn849\") pod \"nova-cell1-novncproxy-0\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.254118 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.266390 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-9jnnr"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.267884 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.298380 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.305406 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc908a6f-cd0f-401c-a719-4663d10bae28-logs\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.305524 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.305552 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-config-data\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.305584 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzml\" (UniqueName: \"kubernetes.io/projected/cc908a6f-cd0f-401c-a719-4663d10bae28-kube-api-access-8vzml\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.306244 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc908a6f-cd0f-401c-a719-4663d10bae28-logs\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" 
Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.307867 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-9jnnr"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.317869 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.319705 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-config-data\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.324197 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.327103 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzml\" (UniqueName: \"kubernetes.io/projected/cc908a6f-cd0f-401c-a719-4663d10bae28-kube-api-access-8vzml\") pod \"nova-metadata-0\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.327548 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.330052 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.332449 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.355793 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.408436 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.409032 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.409220 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.409362 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vw22\" (UniqueName: \"kubernetes.io/projected/0d1d2544-6a36-4e92-82d1-0cd1f937b362-kube-api-access-6vw22\") pod \"nova-scheduler-0\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc 
kubenswrapper[4923]: I0224 03:15:35.414428 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.414530 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5npc\" (UniqueName: \"kubernetes.io/projected/e95191de-44e3-4252-9fcf-cf159a81d3c8-kube-api-access-l5npc\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.414670 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-config-data\") pod \"nova-scheduler-0\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.414738 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-svc\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.414838 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-config\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc 
kubenswrapper[4923]: I0224 03:15:35.522255 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8b89\" (UniqueName: \"kubernetes.io/projected/5a092fee-177a-48c6-b7bf-e28195f851f6-kube-api-access-d8b89\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522324 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-config\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522550 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522606 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522646 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522706 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522740 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vw22\" (UniqueName: \"kubernetes.io/projected/0d1d2544-6a36-4e92-82d1-0cd1f937b362-kube-api-access-6vw22\") pod \"nova-scheduler-0\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522763 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522784 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5npc\" (UniqueName: \"kubernetes.io/projected/e95191de-44e3-4252-9fcf-cf159a81d3c8-kube-api-access-l5npc\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522878 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-config-data\") pod \"nova-scheduler-0\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522911 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-config-data\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522934 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a092fee-177a-48c6-b7bf-e28195f851f6-logs\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.522974 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-svc\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.523092 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-config\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.523777 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-svc\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.524016 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " 
pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.524379 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.525979 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.539260 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.543185 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-config-data\") pod \"nova-scheduler-0\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.569530 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5npc\" (UniqueName: \"kubernetes.io/projected/e95191de-44e3-4252-9fcf-cf159a81d3c8-kube-api-access-l5npc\") pod \"dnsmasq-dns-bccf8f775-9jnnr\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.584645 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vw22\" (UniqueName: \"kubernetes.io/projected/0d1d2544-6a36-4e92-82d1-0cd1f937b362-kube-api-access-6vw22\") pod \"nova-scheduler-0\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.619572 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.632032 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.632412 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-config-data\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.632479 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a092fee-177a-48c6-b7bf-e28195f851f6-logs\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.632564 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8b89\" (UniqueName: \"kubernetes.io/projected/5a092fee-177a-48c6-b7bf-e28195f851f6-kube-api-access-d8b89\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.635757 4923 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a092fee-177a-48c6-b7bf-e28195f851f6-logs\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.640776 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.641749 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.642212 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-config-data\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.661721 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8b89\" (UniqueName: \"kubernetes.io/projected/5a092fee-177a-48c6-b7bf-e28195f851f6-kube-api-access-d8b89\") pod \"nova-api-0\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.676734 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.711270 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.828695 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-clrmb"] Feb 24 03:15:35 crc kubenswrapper[4923]: I0224 03:15:35.973629 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 24 03:15:35 crc kubenswrapper[4923]: W0224 03:15:35.993663 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd073d59f_dbca_42a7_b04e_9efcb0f2dbfe.slice/crio-f40360b76f3e2c254da69b7cb4b686868237bd72483c0920ad9ae14a250be33f WatchSource:0}: Error finding container f40360b76f3e2c254da69b7cb4b686868237bd72483c0920ad9ae14a250be33f: Status 404 returned error can't find the container with id f40360b76f3e2c254da69b7cb4b686868237bd72483c0920ad9ae14a250be33f Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.087837 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gpqwz"] Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.097841 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gpqwz"] Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.097922 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.100947 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.101157 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.156052 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-scripts\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.156171 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7wnv\" (UniqueName: \"kubernetes.io/projected/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-kube-api-access-c7wnv\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.156197 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-config-data\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.156275 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gpqwz\" 
(UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.258610 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-scripts\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.259010 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7wnv\" (UniqueName: \"kubernetes.io/projected/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-kube-api-access-c7wnv\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.259045 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-config-data\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.259251 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.264012 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gpqwz\" 
(UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.264811 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-scripts\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.266261 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-config-data\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.275464 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7wnv\" (UniqueName: \"kubernetes.io/projected/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-kube-api-access-c7wnv\") pod \"nova-cell1-conductor-db-sync-gpqwz\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.300076 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.307772 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.448682 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-9jnnr"] Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.457037 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.493723 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.932887 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-clrmb" event={"ID":"557d4e5b-4b4a-4eee-b199-533822f52b8f","Type":"ContainerStarted","Data":"7c9d389928b4202b4e7b7c4b3abaa2bbde602b2a213d7606a60e186324f4751f"} Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.933256 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-clrmb" event={"ID":"557d4e5b-4b4a-4eee-b199-533822f52b8f","Type":"ContainerStarted","Data":"f0fc24e883641edf128c8c3fa20f0c1808e67b66478bb088481f9504474a1cb3"} Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.935728 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a092fee-177a-48c6-b7bf-e28195f851f6","Type":"ContainerStarted","Data":"82bf24e2a00058e3092e93bec5105a8c4b6c5aaca529172226976201b094af09"} Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.938362 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe","Type":"ContainerStarted","Data":"f40360b76f3e2c254da69b7cb4b686868237bd72483c0920ad9ae14a250be33f"} Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.940414 4923 generic.go:334] "Generic (PLEG): container finished" podID="e95191de-44e3-4252-9fcf-cf159a81d3c8" containerID="9dc420c569360ec65f0cddca7a11509c70744e03c03377e7607febc12afecf44" exitCode=0 Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.940468 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" event={"ID":"e95191de-44e3-4252-9fcf-cf159a81d3c8","Type":"ContainerDied","Data":"9dc420c569360ec65f0cddca7a11509c70744e03c03377e7607febc12afecf44"} Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.940513 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" event={"ID":"e95191de-44e3-4252-9fcf-cf159a81d3c8","Type":"ContainerStarted","Data":"5885fd7b7042c31cd6e77bb54453d5cec1077a75799e2c768c069b4e3bdfff55"} Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.941564 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc908a6f-cd0f-401c-a719-4663d10bae28","Type":"ContainerStarted","Data":"3f16dcfe0ebb7666e8c1cf8d94e71dd97c45320e9a21877b77747579b6d766a1"} Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.942772 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d1d2544-6a36-4e92-82d1-0cd1f937b362","Type":"ContainerStarted","Data":"8f76b31ccb4c569882f79030c8805a86876213c3d7d974929cf38a0d28953b10"} Feb 24 03:15:36 crc kubenswrapper[4923]: I0224 03:15:36.960878 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-clrmb" podStartSLOduration=2.96085649 podStartE2EDuration="2.96085649s" podCreationTimestamp="2026-02-24 03:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:15:36.949256822 +0000 UTC m=+1260.966327625" watchObservedRunningTime="2026-02-24 03:15:36.96085649 +0000 UTC m=+1260.977927313" Feb 24 03:15:37 crc kubenswrapper[4923]: I0224 03:15:37.002168 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gpqwz"] Feb 24 03:15:37 crc kubenswrapper[4923]: W0224 03:15:37.027842 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5631e12f_8fa0_49b5_b6e8_7b0193f2a419.slice/crio-b17fbecf16bba7ff978c34087678afe6f27e307c2c4131fb2704a9aa175dbd18 WatchSource:0}: Error finding container b17fbecf16bba7ff978c34087678afe6f27e307c2c4131fb2704a9aa175dbd18: Status 404 returned error can't find the 
container with id b17fbecf16bba7ff978c34087678afe6f27e307c2c4131fb2704a9aa175dbd18 Feb 24 03:15:37 crc kubenswrapper[4923]: I0224 03:15:37.953479 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gpqwz" event={"ID":"5631e12f-8fa0-49b5-b6e8-7b0193f2a419","Type":"ContainerStarted","Data":"b8441050e113a7b10a77459432a3766d7265f920f3e08cd3238b1acfc4589fd7"} Feb 24 03:15:37 crc kubenswrapper[4923]: I0224 03:15:37.954098 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gpqwz" event={"ID":"5631e12f-8fa0-49b5-b6e8-7b0193f2a419","Type":"ContainerStarted","Data":"b17fbecf16bba7ff978c34087678afe6f27e307c2c4131fb2704a9aa175dbd18"} Feb 24 03:15:37 crc kubenswrapper[4923]: I0224 03:15:37.961017 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" event={"ID":"e95191de-44e3-4252-9fcf-cf159a81d3c8","Type":"ContainerStarted","Data":"5b28b64c4e316f96be21ee2a5821927ed644ecea46ab8f6ce7b0b5fe3c23bb2b"} Feb 24 03:15:37 crc kubenswrapper[4923]: I0224 03:15:37.961728 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:37 crc kubenswrapper[4923]: I0224 03:15:37.982143 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-gpqwz" podStartSLOduration=1.982126338 podStartE2EDuration="1.982126338s" podCreationTimestamp="2026-02-24 03:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:15:37.97368851 +0000 UTC m=+1261.990759333" watchObservedRunningTime="2026-02-24 03:15:37.982126338 +0000 UTC m=+1261.999197151" Feb 24 03:15:37 crc kubenswrapper[4923]: I0224 03:15:37.995771 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" 
podStartSLOduration=2.995753519 podStartE2EDuration="2.995753519s" podCreationTimestamp="2026-02-24 03:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:15:37.988662816 +0000 UTC m=+1262.005733629" watchObservedRunningTime="2026-02-24 03:15:37.995753519 +0000 UTC m=+1262.012824332" Feb 24 03:15:38 crc kubenswrapper[4923]: I0224 03:15:38.536250 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 24 03:15:38 crc kubenswrapper[4923]: I0224 03:15:38.563188 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:39 crc kubenswrapper[4923]: I0224 03:15:39.984406 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe","Type":"ContainerStarted","Data":"bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6"} Feb 24 03:15:39 crc kubenswrapper[4923]: I0224 03:15:39.984587 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d073d59f-dbca-42a7-b04e-9efcb0f2dbfe" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6" gracePeriod=30 Feb 24 03:15:39 crc kubenswrapper[4923]: I0224 03:15:39.990448 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc908a6f-cd0f-401c-a719-4663d10bae28","Type":"ContainerStarted","Data":"2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f"} Feb 24 03:15:40 crc kubenswrapper[4923]: I0224 03:15:40.010125 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d1d2544-6a36-4e92-82d1-0cd1f937b362","Type":"ContainerStarted","Data":"9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598"} Feb 24 
03:15:40 crc kubenswrapper[4923]: I0224 03:15:40.013806 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a092fee-177a-48c6-b7bf-e28195f851f6","Type":"ContainerStarted","Data":"6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521"} Feb 24 03:15:40 crc kubenswrapper[4923]: I0224 03:15:40.021898 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5185685380000002 podStartE2EDuration="6.021877021s" podCreationTimestamp="2026-02-24 03:15:34 +0000 UTC" firstStartedPulling="2026-02-24 03:15:35.998743598 +0000 UTC m=+1260.015814411" lastFinishedPulling="2026-02-24 03:15:39.502052081 +0000 UTC m=+1263.519122894" observedRunningTime="2026-02-24 03:15:40.00786262 +0000 UTC m=+1264.024933423" watchObservedRunningTime="2026-02-24 03:15:40.021877021 +0000 UTC m=+1264.038947834" Feb 24 03:15:40 crc kubenswrapper[4923]: I0224 03:15:40.031130 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.836749655 podStartE2EDuration="5.031109109s" podCreationTimestamp="2026-02-24 03:15:35 +0000 UTC" firstStartedPulling="2026-02-24 03:15:36.307863111 +0000 UTC m=+1260.324933924" lastFinishedPulling="2026-02-24 03:15:39.502222565 +0000 UTC m=+1263.519293378" observedRunningTime="2026-02-24 03:15:40.024871048 +0000 UTC m=+1264.041941871" watchObservedRunningTime="2026-02-24 03:15:40.031109109 +0000 UTC m=+1264.048179922" Feb 24 03:15:40 crc kubenswrapper[4923]: I0224 03:15:40.325385 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:15:40 crc kubenswrapper[4923]: I0224 03:15:40.642399 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.025008 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"cc908a6f-cd0f-401c-a719-4663d10bae28","Type":"ContainerStarted","Data":"14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43"} Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.025104 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerName="nova-metadata-log" containerID="cri-o://2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f" gracePeriod=30 Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.025150 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerName="nova-metadata-metadata" containerID="cri-o://14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43" gracePeriod=30 Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.032934 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a092fee-177a-48c6-b7bf-e28195f851f6","Type":"ContainerStarted","Data":"b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b"} Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.077960 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.883161339 podStartE2EDuration="6.077943264s" podCreationTimestamp="2026-02-24 03:15:35 +0000 UTC" firstStartedPulling="2026-02-24 03:15:36.307399609 +0000 UTC m=+1260.324470432" lastFinishedPulling="2026-02-24 03:15:39.502181544 +0000 UTC m=+1263.519252357" observedRunningTime="2026-02-24 03:15:41.075361518 +0000 UTC m=+1265.092432331" watchObservedRunningTime="2026-02-24 03:15:41.077943264 +0000 UTC m=+1265.095014077" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.082274 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.03191412 
podStartE2EDuration="6.082268556s" podCreationTimestamp="2026-02-24 03:15:35 +0000 UTC" firstStartedPulling="2026-02-24 03:15:36.464899585 +0000 UTC m=+1260.481970408" lastFinishedPulling="2026-02-24 03:15:39.515254031 +0000 UTC m=+1263.532324844" observedRunningTime="2026-02-24 03:15:41.058025761 +0000 UTC m=+1265.075096584" watchObservedRunningTime="2026-02-24 03:15:41.082268556 +0000 UTC m=+1265.099339369" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.666629 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.868947 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc908a6f-cd0f-401c-a719-4663d10bae28-logs\") pod \"cc908a6f-cd0f-401c-a719-4663d10bae28\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.869055 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vzml\" (UniqueName: \"kubernetes.io/projected/cc908a6f-cd0f-401c-a719-4663d10bae28-kube-api-access-8vzml\") pod \"cc908a6f-cd0f-401c-a719-4663d10bae28\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.869195 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-combined-ca-bundle\") pod \"cc908a6f-cd0f-401c-a719-4663d10bae28\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.869324 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc908a6f-cd0f-401c-a719-4663d10bae28-logs" (OuterVolumeSpecName: "logs") pod "cc908a6f-cd0f-401c-a719-4663d10bae28" (UID: "cc908a6f-cd0f-401c-a719-4663d10bae28"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.870032 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-config-data\") pod \"cc908a6f-cd0f-401c-a719-4663d10bae28\" (UID: \"cc908a6f-cd0f-401c-a719-4663d10bae28\") " Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.870466 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc908a6f-cd0f-401c-a719-4663d10bae28-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.874205 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc908a6f-cd0f-401c-a719-4663d10bae28-kube-api-access-8vzml" (OuterVolumeSpecName: "kube-api-access-8vzml") pod "cc908a6f-cd0f-401c-a719-4663d10bae28" (UID: "cc908a6f-cd0f-401c-a719-4663d10bae28"). InnerVolumeSpecName "kube-api-access-8vzml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.894034 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-config-data" (OuterVolumeSpecName: "config-data") pod "cc908a6f-cd0f-401c-a719-4663d10bae28" (UID: "cc908a6f-cd0f-401c-a719-4663d10bae28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.898329 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc908a6f-cd0f-401c-a719-4663d10bae28" (UID: "cc908a6f-cd0f-401c-a719-4663d10bae28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.972745 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.972783 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vzml\" (UniqueName: \"kubernetes.io/projected/cc908a6f-cd0f-401c-a719-4663d10bae28-kube-api-access-8vzml\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:41 crc kubenswrapper[4923]: I0224 03:15:41.972797 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc908a6f-cd0f-401c-a719-4663d10bae28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.045925 4923 generic.go:334] "Generic (PLEG): container finished" podID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerID="14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43" exitCode=0 Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.045963 4923 generic.go:334] "Generic (PLEG): container finished" podID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerID="2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f" exitCode=143 Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.046005 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.046006 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc908a6f-cd0f-401c-a719-4663d10bae28","Type":"ContainerDied","Data":"14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43"} Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.046078 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc908a6f-cd0f-401c-a719-4663d10bae28","Type":"ContainerDied","Data":"2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f"} Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.046099 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc908a6f-cd0f-401c-a719-4663d10bae28","Type":"ContainerDied","Data":"3f16dcfe0ebb7666e8c1cf8d94e71dd97c45320e9a21877b77747579b6d766a1"} Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.046118 4923 scope.go:117] "RemoveContainer" containerID="14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.075395 4923 scope.go:117] "RemoveContainer" containerID="2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.102498 4923 scope.go:117] "RemoveContainer" containerID="14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.102657 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:42 crc kubenswrapper[4923]: E0224 03:15:42.103265 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43\": container with ID starting with 14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43 
not found: ID does not exist" containerID="14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.103320 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43"} err="failed to get container status \"14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43\": rpc error: code = NotFound desc = could not find container \"14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43\": container with ID starting with 14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43 not found: ID does not exist" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.103347 4923 scope.go:117] "RemoveContainer" containerID="2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f" Feb 24 03:15:42 crc kubenswrapper[4923]: E0224 03:15:42.103658 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f\": container with ID starting with 2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f not found: ID does not exist" containerID="2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.103692 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f"} err="failed to get container status \"2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f\": rpc error: code = NotFound desc = could not find container \"2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f\": container with ID starting with 2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f not found: ID does not exist" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 
03:15:42.103720 4923 scope.go:117] "RemoveContainer" containerID="14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.104004 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43"} err="failed to get container status \"14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43\": rpc error: code = NotFound desc = could not find container \"14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43\": container with ID starting with 14aaf6da984a3704a7906e5d3409fa240cc65848a9f0012f63c27f0a3196ce43 not found: ID does not exist" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.104030 4923 scope.go:117] "RemoveContainer" containerID="2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.104248 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f"} err="failed to get container status \"2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f\": rpc error: code = NotFound desc = could not find container \"2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f\": container with ID starting with 2ad27db264d1c68cc37e8f6b348c22adea60c1a5d4528091b76351a8a1e5085f not found: ID does not exist" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.115510 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.132957 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.137771 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:42 crc 
kubenswrapper[4923]: E0224 03:15:42.138198 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerName="nova-metadata-log" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.138218 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerName="nova-metadata-log" Feb 24 03:15:42 crc kubenswrapper[4923]: E0224 03:15:42.138262 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerName="nova-metadata-metadata" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.138273 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerName="nova-metadata-metadata" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.138502 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerName="nova-metadata-log" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.138532 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc908a6f-cd0f-401c-a719-4663d10bae28" containerName="nova-metadata-metadata" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.139756 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.141678 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.141843 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.160829 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.278584 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzsbf\" (UniqueName: \"kubernetes.io/projected/83200aaa-baf7-4458-abe5-2972d8da4cf0-kube-api-access-xzsbf\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.278655 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.278804 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.278847 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83200aaa-baf7-4458-abe5-2972d8da4cf0-logs\") pod 
\"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.278896 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-config-data\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.380732 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83200aaa-baf7-4458-abe5-2972d8da4cf0-logs\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.380792 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-config-data\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.380878 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzsbf\" (UniqueName: \"kubernetes.io/projected/83200aaa-baf7-4458-abe5-2972d8da4cf0-kube-api-access-xzsbf\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.380921 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.380979 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.381203 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83200aaa-baf7-4458-abe5-2972d8da4cf0-logs\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.385142 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.385926 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-config-data\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.387005 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.399989 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzsbf\" (UniqueName: \"kubernetes.io/projected/83200aaa-baf7-4458-abe5-2972d8da4cf0-kube-api-access-xzsbf\") pod 
\"nova-metadata-0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") " pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.466711 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:15:42 crc kubenswrapper[4923]: I0224 03:15:42.990544 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:43 crc kubenswrapper[4923]: I0224 03:15:43.056830 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83200aaa-baf7-4458-abe5-2972d8da4cf0","Type":"ContainerStarted","Data":"3d6258f6ed8c4959ce2d80b7dd77e0a31ea89b057a58ca40ec564746806dcf95"} Feb 24 03:15:43 crc kubenswrapper[4923]: I0224 03:15:43.722600 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc908a6f-cd0f-401c-a719-4663d10bae28" path="/var/lib/kubelet/pods/cc908a6f-cd0f-401c-a719-4663d10bae28/volumes" Feb 24 03:15:44 crc kubenswrapper[4923]: I0224 03:15:44.077682 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83200aaa-baf7-4458-abe5-2972d8da4cf0","Type":"ContainerStarted","Data":"f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0"} Feb 24 03:15:44 crc kubenswrapper[4923]: I0224 03:15:44.077717 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83200aaa-baf7-4458-abe5-2972d8da4cf0","Type":"ContainerStarted","Data":"48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b"} Feb 24 03:15:44 crc kubenswrapper[4923]: I0224 03:15:44.079333 4923 generic.go:334] "Generic (PLEG): container finished" podID="557d4e5b-4b4a-4eee-b199-533822f52b8f" containerID="7c9d389928b4202b4e7b7c4b3abaa2bbde602b2a213d7606a60e186324f4751f" exitCode=0 Feb 24 03:15:44 crc kubenswrapper[4923]: I0224 03:15:44.079361 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-clrmb" 
event={"ID":"557d4e5b-4b4a-4eee-b199-533822f52b8f","Type":"ContainerDied","Data":"7c9d389928b4202b4e7b7c4b3abaa2bbde602b2a213d7606a60e186324f4751f"} Feb 24 03:15:44 crc kubenswrapper[4923]: I0224 03:15:44.100469 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.100449002 podStartE2EDuration="2.100449002s" podCreationTimestamp="2026-02-24 03:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:15:44.096543931 +0000 UTC m=+1268.113614744" watchObservedRunningTime="2026-02-24 03:15:44.100449002 +0000 UTC m=+1268.117519815" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.091419 4923 generic.go:334] "Generic (PLEG): container finished" podID="5631e12f-8fa0-49b5-b6e8-7b0193f2a419" containerID="b8441050e113a7b10a77459432a3766d7265f920f3e08cd3238b1acfc4589fd7" exitCode=0 Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.091548 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gpqwz" event={"ID":"5631e12f-8fa0-49b5-b6e8-7b0193f2a419","Type":"ContainerDied","Data":"b8441050e113a7b10a77459432a3766d7265f920f3e08cd3238b1acfc4589fd7"} Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.506468 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.642288 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.644661 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-combined-ca-bundle\") pod \"557d4e5b-4b4a-4eee-b199-533822f52b8f\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.644730 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w4kz\" (UniqueName: \"kubernetes.io/projected/557d4e5b-4b4a-4eee-b199-533822f52b8f-kube-api-access-2w4kz\") pod \"557d4e5b-4b4a-4eee-b199-533822f52b8f\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.644883 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-config-data\") pod \"557d4e5b-4b4a-4eee-b199-533822f52b8f\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.644979 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-scripts\") pod \"557d4e5b-4b4a-4eee-b199-533822f52b8f\" (UID: \"557d4e5b-4b4a-4eee-b199-533822f52b8f\") " Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.652528 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-scripts" (OuterVolumeSpecName: "scripts") pod "557d4e5b-4b4a-4eee-b199-533822f52b8f" (UID: "557d4e5b-4b4a-4eee-b199-533822f52b8f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.652698 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557d4e5b-4b4a-4eee-b199-533822f52b8f-kube-api-access-2w4kz" (OuterVolumeSpecName: "kube-api-access-2w4kz") pod "557d4e5b-4b4a-4eee-b199-533822f52b8f" (UID: "557d4e5b-4b4a-4eee-b199-533822f52b8f"). InnerVolumeSpecName "kube-api-access-2w4kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.678441 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.683529 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "557d4e5b-4b4a-4eee-b199-533822f52b8f" (UID: "557d4e5b-4b4a-4eee-b199-533822f52b8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.687875 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.689689 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-config-data" (OuterVolumeSpecName: "config-data") pod "557d4e5b-4b4a-4eee-b199-533822f52b8f" (UID: "557d4e5b-4b4a-4eee-b199-533822f52b8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.733248 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.733741 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.755486 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.755517 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w4kz\" (UniqueName: \"kubernetes.io/projected/557d4e5b-4b4a-4eee-b199-533822f52b8f-kube-api-access-2w4kz\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.755531 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.755543 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557d4e5b-4b4a-4eee-b199-533822f52b8f-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.757806 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7g9px"] Feb 24 03:15:45 crc kubenswrapper[4923]: I0224 03:15:45.758079 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" podUID="ec526ce6-1884-41fb-a3f1-070c22309734" containerName="dnsmasq-dns" containerID="cri-o://f114ca67678ea2d1c796aa96edf085f5b978e6d7c16f28713f642f436d9688be" gracePeriod=10 Feb 24 03:15:46 crc 
kubenswrapper[4923]: I0224 03:15:46.134812 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-clrmb" event={"ID":"557d4e5b-4b4a-4eee-b199-533822f52b8f","Type":"ContainerDied","Data":"f0fc24e883641edf128c8c3fa20f0c1808e67b66478bb088481f9504474a1cb3"} Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.135896 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0fc24e883641edf128c8c3fa20f0c1808e67b66478bb088481f9504474a1cb3" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.134814 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-clrmb" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.136759 4923 generic.go:334] "Generic (PLEG): container finished" podID="ec526ce6-1884-41fb-a3f1-070c22309734" containerID="f114ca67678ea2d1c796aa96edf085f5b978e6d7c16f28713f642f436d9688be" exitCode=0 Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.136794 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" event={"ID":"ec526ce6-1884-41fb-a3f1-070c22309734","Type":"ContainerDied","Data":"f114ca67678ea2d1c796aa96edf085f5b978e6d7c16f28713f642f436d9688be"} Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.337051 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.357869 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.359209 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-log" containerID="cri-o://6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521" gracePeriod=30 Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.359370 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-api" containerID="cri-o://b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b" gracePeriod=30 Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.364674 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.371138 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": EOF" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.371431 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": EOF" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.385995 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-sb\") pod \"ec526ce6-1884-41fb-a3f1-070c22309734\" (UID: 
\"ec526ce6-1884-41fb-a3f1-070c22309734\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.386048 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-nb\") pod \"ec526ce6-1884-41fb-a3f1-070c22309734\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.386121 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-swift-storage-0\") pod \"ec526ce6-1884-41fb-a3f1-070c22309734\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.386158 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzdh4\" (UniqueName: \"kubernetes.io/projected/ec526ce6-1884-41fb-a3f1-070c22309734-kube-api-access-fzdh4\") pod \"ec526ce6-1884-41fb-a3f1-070c22309734\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.386245 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-svc\") pod \"ec526ce6-1884-41fb-a3f1-070c22309734\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.386280 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-config\") pod \"ec526ce6-1884-41fb-a3f1-070c22309734\" (UID: \"ec526ce6-1884-41fb-a3f1-070c22309734\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.407943 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ec526ce6-1884-41fb-a3f1-070c22309734-kube-api-access-fzdh4" (OuterVolumeSpecName: "kube-api-access-fzdh4") pod "ec526ce6-1884-41fb-a3f1-070c22309734" (UID: "ec526ce6-1884-41fb-a3f1-070c22309734"). InnerVolumeSpecName "kube-api-access-fzdh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.445359 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.488435 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzdh4\" (UniqueName: \"kubernetes.io/projected/ec526ce6-1884-41fb-a3f1-070c22309734-kube-api-access-fzdh4\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.518773 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ec526ce6-1884-41fb-a3f1-070c22309734" (UID: "ec526ce6-1884-41fb-a3f1-070c22309734"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.535925 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.536113 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerName="nova-metadata-log" containerID="cri-o://48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b" gracePeriod=30 Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.536530 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerName="nova-metadata-metadata" containerID="cri-o://f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0" gracePeriod=30 Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.544145 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec526ce6-1884-41fb-a3f1-070c22309734" (UID: "ec526ce6-1884-41fb-a3f1-070c22309734"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.544706 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec526ce6-1884-41fb-a3f1-070c22309734" (UID: "ec526ce6-1884-41fb-a3f1-070c22309734"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.556615 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec526ce6-1884-41fb-a3f1-070c22309734" (UID: "ec526ce6-1884-41fb-a3f1-070c22309734"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.571106 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-config" (OuterVolumeSpecName: "config") pod "ec526ce6-1884-41fb-a3f1-070c22309734" (UID: "ec526ce6-1884-41fb-a3f1-070c22309734"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.594892 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.594958 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.594970 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.594981 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:46 crc 
kubenswrapper[4923]: I0224 03:15:46.594989 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec526ce6-1884-41fb-a3f1-070c22309734-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:46 crc kubenswrapper[4923]: E0224 03:15:46.638779 4923 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a092fee_177a_48c6_b7bf_e28195f851f6.slice/crio-6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521.scope\": RecentStats: unable to find data in memory cache]" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.777724 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.799102 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7wnv\" (UniqueName: \"kubernetes.io/projected/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-kube-api-access-c7wnv\") pod \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.799166 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-scripts\") pod \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.799193 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-config-data\") pod \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.799258 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-combined-ca-bundle\") pod \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\" (UID: \"5631e12f-8fa0-49b5-b6e8-7b0193f2a419\") " Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.807893 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-scripts" (OuterVolumeSpecName: "scripts") pod "5631e12f-8fa0-49b5-b6e8-7b0193f2a419" (UID: "5631e12f-8fa0-49b5-b6e8-7b0193f2a419"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.813681 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-kube-api-access-c7wnv" (OuterVolumeSpecName: "kube-api-access-c7wnv") pod "5631e12f-8fa0-49b5-b6e8-7b0193f2a419" (UID: "5631e12f-8fa0-49b5-b6e8-7b0193f2a419"). InnerVolumeSpecName "kube-api-access-c7wnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.828880 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5631e12f-8fa0-49b5-b6e8-7b0193f2a419" (UID: "5631e12f-8fa0-49b5-b6e8-7b0193f2a419"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.851735 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-config-data" (OuterVolumeSpecName: "config-data") pod "5631e12f-8fa0-49b5-b6e8-7b0193f2a419" (UID: "5631e12f-8fa0-49b5-b6e8-7b0193f2a419"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.905956 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.906234 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7wnv\" (UniqueName: \"kubernetes.io/projected/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-kube-api-access-c7wnv\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.906246 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:46 crc kubenswrapper[4923]: I0224 03:15:46.906255 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5631e12f-8fa0-49b5-b6e8-7b0193f2a419-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.012670 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.150103 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" event={"ID":"ec526ce6-1884-41fb-a3f1-070c22309734","Type":"ContainerDied","Data":"09763cd3b326c17c848fd34115a0ba18160cbfcd6b331cbb102088fe279ba6cf"} Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.150158 4923 scope.go:117] "RemoveContainer" containerID="f114ca67678ea2d1c796aa96edf085f5b978e6d7c16f28713f642f436d9688be" Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.150172 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-7g9px" Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.154552 4923 generic.go:334] "Generic (PLEG): container finished" podID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerID="6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521" exitCode=143 Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.154646 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a092fee-177a-48c6-b7bf-e28195f851f6","Type":"ContainerDied","Data":"6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521"} Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.156630 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gpqwz" event={"ID":"5631e12f-8fa0-49b5-b6e8-7b0193f2a419","Type":"ContainerDied","Data":"b17fbecf16bba7ff978c34087678afe6f27e307c2c4131fb2704a9aa175dbd18"} Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.156989 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b17fbecf16bba7ff978c34087678afe6f27e307c2c4131fb2704a9aa175dbd18" Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.157424 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gpqwz" Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.162126 4923 generic.go:334] "Generic (PLEG): container finished" podID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerID="f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0" exitCode=0 Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.162160 4923 generic.go:334] "Generic (PLEG): container finished" podID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerID="48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b" exitCode=143 Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.162513 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.163044 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83200aaa-baf7-4458-abe5-2972d8da4cf0","Type":"ContainerDied","Data":"f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0"}
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.163081 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83200aaa-baf7-4458-abe5-2972d8da4cf0","Type":"ContainerDied","Data":"48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b"}
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.163096 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83200aaa-baf7-4458-abe5-2972d8da4cf0","Type":"ContainerDied","Data":"3d6258f6ed8c4959ce2d80b7dd77e0a31ea89b057a58ca40ec564746806dcf95"}
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.190365 4923 scope.go:117] "RemoveContainer" containerID="e37c5af9022ba26453fae52f6c25c27668f5470ea7c4070326747c0705bf4131"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.210811 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzsbf\" (UniqueName: \"kubernetes.io/projected/83200aaa-baf7-4458-abe5-2972d8da4cf0-kube-api-access-xzsbf\") pod \"83200aaa-baf7-4458-abe5-2972d8da4cf0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") "
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.210933 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83200aaa-baf7-4458-abe5-2972d8da4cf0-logs\") pod \"83200aaa-baf7-4458-abe5-2972d8da4cf0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") "
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.211011 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-combined-ca-bundle\") pod \"83200aaa-baf7-4458-abe5-2972d8da4cf0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") "
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.211452 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-config-data\") pod \"83200aaa-baf7-4458-abe5-2972d8da4cf0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") "
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.211478 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-nova-metadata-tls-certs\") pod \"83200aaa-baf7-4458-abe5-2972d8da4cf0\" (UID: \"83200aaa-baf7-4458-abe5-2972d8da4cf0\") "
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.211774 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83200aaa-baf7-4458-abe5-2972d8da4cf0-logs" (OuterVolumeSpecName: "logs") pod "83200aaa-baf7-4458-abe5-2972d8da4cf0" (UID: "83200aaa-baf7-4458-abe5-2972d8da4cf0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.216097 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 24 03:15:47 crc kubenswrapper[4923]: E0224 03:15:47.216798 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557d4e5b-4b4a-4eee-b199-533822f52b8f" containerName="nova-manage"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.216886 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="557d4e5b-4b4a-4eee-b199-533822f52b8f" containerName="nova-manage"
Feb 24 03:15:47 crc kubenswrapper[4923]: E0224 03:15:47.216973 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerName="nova-metadata-log"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.217041 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerName="nova-metadata-log"
Feb 24 03:15:47 crc kubenswrapper[4923]: E0224 03:15:47.217137 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerName="nova-metadata-metadata"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.217199 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerName="nova-metadata-metadata"
Feb 24 03:15:47 crc kubenswrapper[4923]: E0224 03:15:47.217262 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec526ce6-1884-41fb-a3f1-070c22309734" containerName="dnsmasq-dns"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.217335 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec526ce6-1884-41fb-a3f1-070c22309734" containerName="dnsmasq-dns"
Feb 24 03:15:47 crc kubenswrapper[4923]: E0224 03:15:47.217390 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5631e12f-8fa0-49b5-b6e8-7b0193f2a419" containerName="nova-cell1-conductor-db-sync"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.217439 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="5631e12f-8fa0-49b5-b6e8-7b0193f2a419" containerName="nova-cell1-conductor-db-sync"
Feb 24 03:15:47 crc kubenswrapper[4923]: E0224 03:15:47.217510 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec526ce6-1884-41fb-a3f1-070c22309734" containerName="init"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.217566 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec526ce6-1884-41fb-a3f1-070c22309734" containerName="init"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.217785 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="5631e12f-8fa0-49b5-b6e8-7b0193f2a419" containerName="nova-cell1-conductor-db-sync"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.217864 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec526ce6-1884-41fb-a3f1-070c22309734" containerName="dnsmasq-dns"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.218536 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerName="nova-metadata-log"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.218599 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="83200aaa-baf7-4458-abe5-2972d8da4cf0" containerName="nova-metadata-metadata"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.218664 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="557d4e5b-4b4a-4eee-b199-533822f52b8f" containerName="nova-manage"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.219263 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.222486 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.246576 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83200aaa-baf7-4458-abe5-2972d8da4cf0-kube-api-access-xzsbf" (OuterVolumeSpecName: "kube-api-access-xzsbf") pod "83200aaa-baf7-4458-abe5-2972d8da4cf0" (UID: "83200aaa-baf7-4458-abe5-2972d8da4cf0"). InnerVolumeSpecName "kube-api-access-xzsbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.258412 4923 scope.go:117] "RemoveContainer" containerID="f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.259305 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-config-data" (OuterVolumeSpecName: "config-data") pod "83200aaa-baf7-4458-abe5-2972d8da4cf0" (UID: "83200aaa-baf7-4458-abe5-2972d8da4cf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.267458 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83200aaa-baf7-4458-abe5-2972d8da4cf0" (UID: "83200aaa-baf7-4458-abe5-2972d8da4cf0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.276753 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7g9px"]
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.289151 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "83200aaa-baf7-4458-abe5-2972d8da4cf0" (UID: "83200aaa-baf7-4458-abe5-2972d8da4cf0"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.293863 4923 scope.go:117] "RemoveContainer" containerID="48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.293997 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.312373 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-7g9px"]
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.313122 4923 scope.go:117] "RemoveContainer" containerID="f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.313922 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5130427e-ca28-4060-ac80-72202959e07f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5130427e-ca28-4060-ac80-72202959e07f\") " pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314003 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5130427e-ca28-4060-ac80-72202959e07f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5130427e-ca28-4060-ac80-72202959e07f\") " pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: E0224 03:15:47.314102 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0\": container with ID starting with f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0 not found: ID does not exist" containerID="f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314116 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpdvp\" (UniqueName: \"kubernetes.io/projected/5130427e-ca28-4060-ac80-72202959e07f-kube-api-access-wpdvp\") pod \"nova-cell1-conductor-0\" (UID: \"5130427e-ca28-4060-ac80-72202959e07f\") " pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314135 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0"} err="failed to get container status \"f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0\": rpc error: code = NotFound desc = could not find container \"f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0\": container with ID starting with f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0 not found: ID does not exist"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314160 4923 scope.go:117] "RemoveContainer" containerID="48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314235 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314266 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-config-data\") on node \"crc\" DevicePath \"\""
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314278 4923 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/83200aaa-baf7-4458-abe5-2972d8da4cf0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314290 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzsbf\" (UniqueName: \"kubernetes.io/projected/83200aaa-baf7-4458-abe5-2972d8da4cf0-kube-api-access-xzsbf\") on node \"crc\" DevicePath \"\""
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314333 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83200aaa-baf7-4458-abe5-2972d8da4cf0-logs\") on node \"crc\" DevicePath \"\""
Feb 24 03:15:47 crc kubenswrapper[4923]: E0224 03:15:47.314577 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b\": container with ID starting with 48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b not found: ID does not exist" containerID="48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314601 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b"} err="failed to get container status \"48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b\": rpc error: code = NotFound desc = could not find container \"48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b\": container with ID starting with 48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b not found: ID does not exist"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314654 4923 scope.go:117] "RemoveContainer" containerID="f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314875 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0"} err="failed to get container status \"f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0\": rpc error: code = NotFound desc = could not find container \"f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0\": container with ID starting with f10fb5852addd69a3e7f24b36009b9d8f7d9cb8fe545c221df46830beafd56f0 not found: ID does not exist"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.314901 4923 scope.go:117] "RemoveContainer" containerID="48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.315564 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b"} err="failed to get container status \"48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b\": rpc error: code = NotFound desc = could not find container \"48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b\": container with ID starting with 48828d2bdcb196e5ffbf4f7e68b1b57add9802708ecfb185e4134f594334245b not found: ID does not exist"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.415251 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5130427e-ca28-4060-ac80-72202959e07f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5130427e-ca28-4060-ac80-72202959e07f\") " pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.415754 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5130427e-ca28-4060-ac80-72202959e07f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5130427e-ca28-4060-ac80-72202959e07f\") " pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.415813 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpdvp\" (UniqueName: \"kubernetes.io/projected/5130427e-ca28-4060-ac80-72202959e07f-kube-api-access-wpdvp\") pod \"nova-cell1-conductor-0\" (UID: \"5130427e-ca28-4060-ac80-72202959e07f\") " pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.419886 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5130427e-ca28-4060-ac80-72202959e07f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5130427e-ca28-4060-ac80-72202959e07f\") " pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.434984 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5130427e-ca28-4060-ac80-72202959e07f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5130427e-ca28-4060-ac80-72202959e07f\") " pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.439037 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpdvp\" (UniqueName: \"kubernetes.io/projected/5130427e-ca28-4060-ac80-72202959e07f-kube-api-access-wpdvp\") pod \"nova-cell1-conductor-0\" (UID: \"5130427e-ca28-4060-ac80-72202959e07f\") " pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.570912 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.573032 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.584926 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.594569 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.628955 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.637789 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.638036 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.663864 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.724074 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83200aaa-baf7-4458-abe5-2972d8da4cf0" path="/var/lib/kubelet/pods/83200aaa-baf7-4458-abe5-2972d8da4cf0/volumes"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.725065 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec526ce6-1884-41fb-a3f1-070c22309734" path="/var/lib/kubelet/pods/ec526ce6-1884-41fb-a3f1-070c22309734/volumes"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.728413 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.728482 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hprzq\" (UniqueName: \"kubernetes.io/projected/4a19b08f-d8dc-4bf6-b907-f349739f12b8-kube-api-access-hprzq\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.728667 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-config-data\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.728691 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.728724 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a19b08f-d8dc-4bf6-b907-f349739f12b8-logs\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.830108 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a19b08f-d8dc-4bf6-b907-f349739f12b8-logs\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.830201 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.830235 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hprzq\" (UniqueName: \"kubernetes.io/projected/4a19b08f-d8dc-4bf6-b907-f349739f12b8-kube-api-access-hprzq\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.830324 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-config-data\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.830341 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.831139 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a19b08f-d8dc-4bf6-b907-f349739f12b8-logs\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.837966 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.847148 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.848868 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-config-data\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:47 crc kubenswrapper[4923]: I0224 03:15:47.849263 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hprzq\" (UniqueName: \"kubernetes.io/projected/4a19b08f-d8dc-4bf6-b907-f349739f12b8-kube-api-access-hprzq\") pod \"nova-metadata-0\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " pod="openstack/nova-metadata-0"
Feb 24 03:15:48 crc kubenswrapper[4923]: I0224 03:15:48.007564 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 24 03:15:48 crc kubenswrapper[4923]: I0224 03:15:48.056084 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 24 03:15:48 crc kubenswrapper[4923]: W0224 03:15:48.056786 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5130427e_ca28_4060_ac80_72202959e07f.slice/crio-7180527833cc9255b08e70cd12c88ff9b5334b5cc472c3a6d8093050bc2b6f55 WatchSource:0}: Error finding container 7180527833cc9255b08e70cd12c88ff9b5334b5cc472c3a6d8093050bc2b6f55: Status 404 returned error can't find the container with id 7180527833cc9255b08e70cd12c88ff9b5334b5cc472c3a6d8093050bc2b6f55
Feb 24 03:15:48 crc kubenswrapper[4923]: I0224 03:15:48.195867 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0d1d2544-6a36-4e92-82d1-0cd1f937b362" containerName="nova-scheduler-scheduler" containerID="cri-o://9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598" gracePeriod=30
Feb 24 03:15:48 crc kubenswrapper[4923]: I0224 03:15:48.195993 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5130427e-ca28-4060-ac80-72202959e07f","Type":"ContainerStarted","Data":"7180527833cc9255b08e70cd12c88ff9b5334b5cc472c3a6d8093050bc2b6f55"}
Feb 24 03:15:48 crc kubenswrapper[4923]: I0224 03:15:48.489066 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 24 03:15:48 crc kubenswrapper[4923]: W0224 03:15:48.493677 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a19b08f_d8dc_4bf6_b907_f349739f12b8.slice/crio-187c1fa63f1ea365f2deb87a195b6552ccb7915000c83246d7230f6c3f9b4d2f WatchSource:0}: Error finding container 187c1fa63f1ea365f2deb87a195b6552ccb7915000c83246d7230f6c3f9b4d2f: Status 404 returned error can't find the container with id 187c1fa63f1ea365f2deb87a195b6552ccb7915000c83246d7230f6c3f9b4d2f
Feb 24 03:15:49 crc kubenswrapper[4923]: I0224 03:15:49.208336 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5130427e-ca28-4060-ac80-72202959e07f","Type":"ContainerStarted","Data":"52ed5f3607bb05668fd6c7cd0308c20c7a55fab52625b533bb43f928c95b7a38"}
Feb 24 03:15:49 crc kubenswrapper[4923]: I0224 03:15:49.208938 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 24 03:15:49 crc kubenswrapper[4923]: I0224 03:15:49.210328 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a19b08f-d8dc-4bf6-b907-f349739f12b8","Type":"ContainerStarted","Data":"441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7"}
Feb 24 03:15:49 crc kubenswrapper[4923]: I0224 03:15:49.210370 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a19b08f-d8dc-4bf6-b907-f349739f12b8","Type":"ContainerStarted","Data":"95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09"}
Feb 24 03:15:49 crc kubenswrapper[4923]: I0224 03:15:49.210380 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a19b08f-d8dc-4bf6-b907-f349739f12b8","Type":"ContainerStarted","Data":"187c1fa63f1ea365f2deb87a195b6552ccb7915000c83246d7230f6c3f9b4d2f"}
Feb 24 03:15:49 crc kubenswrapper[4923]: I0224 03:15:49.243553 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.243513694 podStartE2EDuration="2.243513694s" podCreationTimestamp="2026-02-24 03:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:15:49.226614818 +0000 UTC m=+1273.243685661" watchObservedRunningTime="2026-02-24 03:15:49.243513694 +0000 UTC m=+1273.260584597"
Feb 24 03:15:49 crc kubenswrapper[4923]: I0224 03:15:49.261021 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.260987014 podStartE2EDuration="2.260987014s" podCreationTimestamp="2026-02-24 03:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:15:49.259262609 +0000 UTC m=+1273.276333522" watchObservedRunningTime="2026-02-24 03:15:49.260987014 +0000 UTC m=+1273.278057877"
Feb 24 03:15:50 crc kubenswrapper[4923]: E0224 03:15:50.643551 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 24 03:15:50 crc kubenswrapper[4923]: E0224 03:15:50.646684 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 24 03:15:50 crc kubenswrapper[4923]: E0224 03:15:50.648792 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 24 03:15:50 crc kubenswrapper[4923]: E0224 03:15:50.648875 4923 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0d1d2544-6a36-4e92-82d1-0cd1f937b362" containerName="nova-scheduler-scheduler"
Feb 24 03:15:51 crc kubenswrapper[4923]: I0224 03:15:51.741782 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 24 03:15:51 crc kubenswrapper[4923]: I0224 03:15:51.906784 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-combined-ca-bundle\") pod \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") "
Feb 24 03:15:51 crc kubenswrapper[4923]: I0224 03:15:51.906839 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-config-data\") pod \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") "
Feb 24 03:15:51 crc kubenswrapper[4923]: I0224 03:15:51.906874 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vw22\" (UniqueName: \"kubernetes.io/projected/0d1d2544-6a36-4e92-82d1-0cd1f937b362-kube-api-access-6vw22\") pod \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\" (UID: \"0d1d2544-6a36-4e92-82d1-0cd1f937b362\") "
Feb 24 03:15:51 crc kubenswrapper[4923]: I0224 03:15:51.915589 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d1d2544-6a36-4e92-82d1-0cd1f937b362-kube-api-access-6vw22" (OuterVolumeSpecName: "kube-api-access-6vw22") pod "0d1d2544-6a36-4e92-82d1-0cd1f937b362" (UID: "0d1d2544-6a36-4e92-82d1-0cd1f937b362"). InnerVolumeSpecName "kube-api-access-6vw22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:15:51 crc kubenswrapper[4923]: I0224 03:15:51.940227 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-config-data" (OuterVolumeSpecName: "config-data") pod "0d1d2544-6a36-4e92-82d1-0cd1f937b362" (UID: "0d1d2544-6a36-4e92-82d1-0cd1f937b362"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:15:51 crc kubenswrapper[4923]: I0224 03:15:51.961819 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d1d2544-6a36-4e92-82d1-0cd1f937b362" (UID: "0d1d2544-6a36-4e92-82d1-0cd1f937b362"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.008530 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.008560 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d1d2544-6a36-4e92-82d1-0cd1f937b362-config-data\") on node \"crc\" DevicePath \"\""
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.008569 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vw22\" (UniqueName: \"kubernetes.io/projected/0d1d2544-6a36-4e92-82d1-0cd1f937b362-kube-api-access-6vw22\") on node \"crc\" DevicePath \"\""
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.147209 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.239912 4923 generic.go:334] "Generic (PLEG): container finished" podID="0d1d2544-6a36-4e92-82d1-0cd1f937b362" containerID="9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598" exitCode=0
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.240061 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d1d2544-6a36-4e92-82d1-0cd1f937b362","Type":"ContainerDied","Data":"9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598"}
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.240094 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0d1d2544-6a36-4e92-82d1-0cd1f937b362","Type":"ContainerDied","Data":"8f76b31ccb4c569882f79030c8805a86876213c3d7d974929cf38a0d28953b10"}
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.240136 4923 scope.go:117] "RemoveContainer" containerID="9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598"
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.240321 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.245011 4923 generic.go:334] "Generic (PLEG): container finished" podID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerID="b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b" exitCode=0
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.245066 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a092fee-177a-48c6-b7bf-e28195f851f6","Type":"ContainerDied","Data":"b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b"}
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.245097 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a092fee-177a-48c6-b7bf-e28195f851f6","Type":"ContainerDied","Data":"82bf24e2a00058e3092e93bec5105a8c4b6c5aaca529172226976201b094af09"}
Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.245163 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.270071 4923 scope.go:117] "RemoveContainer" containerID="9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598" Feb 24 03:15:52 crc kubenswrapper[4923]: E0224 03:15:52.270587 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598\": container with ID starting with 9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598 not found: ID does not exist" containerID="9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.270629 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598"} err="failed to get container status \"9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598\": rpc error: code = NotFound desc = could not find container \"9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598\": container with ID starting with 9c59e84ebf0d4ef8572764e9118f99b4c8d9299ac6b6ebcdb9f759c57dd03598 not found: ID does not exist" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.270655 4923 scope.go:117] "RemoveContainer" containerID="b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.287454 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.312313 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.314051 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5a092fee-177a-48c6-b7bf-e28195f851f6-logs\") pod \"5a092fee-177a-48c6-b7bf-e28195f851f6\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.314139 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-config-data\") pod \"5a092fee-177a-48c6-b7bf-e28195f851f6\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.314212 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-combined-ca-bundle\") pod \"5a092fee-177a-48c6-b7bf-e28195f851f6\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.314250 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8b89\" (UniqueName: \"kubernetes.io/projected/5a092fee-177a-48c6-b7bf-e28195f851f6-kube-api-access-d8b89\") pod \"5a092fee-177a-48c6-b7bf-e28195f851f6\" (UID: \"5a092fee-177a-48c6-b7bf-e28195f851f6\") " Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.314637 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a092fee-177a-48c6-b7bf-e28195f851f6-logs" (OuterVolumeSpecName: "logs") pod "5a092fee-177a-48c6-b7bf-e28195f851f6" (UID: "5a092fee-177a-48c6-b7bf-e28195f851f6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.314950 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a092fee-177a-48c6-b7bf-e28195f851f6-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.316173 4923 scope.go:117] "RemoveContainer" containerID="6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.324134 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:15:52 crc kubenswrapper[4923]: E0224 03:15:52.324569 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-api" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.324591 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-api" Feb 24 03:15:52 crc kubenswrapper[4923]: E0224 03:15:52.324613 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d1d2544-6a36-4e92-82d1-0cd1f937b362" containerName="nova-scheduler-scheduler" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.324620 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d1d2544-6a36-4e92-82d1-0cd1f937b362" containerName="nova-scheduler-scheduler" Feb 24 03:15:52 crc kubenswrapper[4923]: E0224 03:15:52.324633 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-log" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.324640 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-log" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.324845 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d1d2544-6a36-4e92-82d1-0cd1f937b362" 
containerName="nova-scheduler-scheduler" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.324881 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-log" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.324890 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" containerName="nova-api-api" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.328340 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.331734 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a092fee-177a-48c6-b7bf-e28195f851f6-kube-api-access-d8b89" (OuterVolumeSpecName: "kube-api-access-d8b89") pod "5a092fee-177a-48c6-b7bf-e28195f851f6" (UID: "5a092fee-177a-48c6-b7bf-e28195f851f6"). InnerVolumeSpecName "kube-api-access-d8b89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.331803 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.333464 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.344895 4923 scope.go:117] "RemoveContainer" containerID="b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b" Feb 24 03:15:52 crc kubenswrapper[4923]: E0224 03:15:52.345974 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b\": container with ID starting with b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b not found: ID does not exist" containerID="b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.346023 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b"} err="failed to get container status \"b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b\": rpc error: code = NotFound desc = could not find container \"b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b\": container with ID starting with b912e50eedb0bc6a175403fc247fe52c5e467a301b549cdd885cf3e74465a23b not found: ID does not exist" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.346045 4923 scope.go:117] "RemoveContainer" containerID="6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521" Feb 24 03:15:52 crc kubenswrapper[4923]: E0224 03:15:52.348170 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521\": container with ID starting with 6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521 not found: ID does not exist" containerID="6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.348236 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521"} err="failed to get container status \"6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521\": rpc error: code = NotFound desc = could not find container \"6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521\": container with ID starting with 6eec95db40ca8c77cf65d0cf0826cc7097cfe9b554a7db26bb8281d453ea7521 not found: ID does not exist" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.356138 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-config-data" (OuterVolumeSpecName: "config-data") pod "5a092fee-177a-48c6-b7bf-e28195f851f6" (UID: "5a092fee-177a-48c6-b7bf-e28195f851f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.360488 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a092fee-177a-48c6-b7bf-e28195f851f6" (UID: "5a092fee-177a-48c6-b7bf-e28195f851f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.417036 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.417071 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a092fee-177a-48c6-b7bf-e28195f851f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.417081 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8b89\" (UniqueName: \"kubernetes.io/projected/5a092fee-177a-48c6-b7bf-e28195f851f6-kube-api-access-d8b89\") on node \"crc\" DevicePath \"\"" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.518722 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmsm\" (UniqueName: \"kubernetes.io/projected/21bbf39b-7001-4721-a0d4-95d2f1d523e9-kube-api-access-nsmsm\") pod \"nova-scheduler-0\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.518787 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-config-data\") pod \"nova-scheduler-0\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.518848 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " 
pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.579650 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.598339 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.613808 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.615869 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.618186 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.620050 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmsm\" (UniqueName: \"kubernetes.io/projected/21bbf39b-7001-4721-a0d4-95d2f1d523e9-kube-api-access-nsmsm\") pod \"nova-scheduler-0\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.620156 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-config-data\") pod \"nova-scheduler-0\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.620199 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.624536 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.646257 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-config-data\") pod \"nova-scheduler-0\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.650993 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsmsm\" (UniqueName: \"kubernetes.io/projected/21bbf39b-7001-4721-a0d4-95d2f1d523e9-kube-api-access-nsmsm\") pod \"nova-scheduler-0\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.653614 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.657056 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.742122 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8866591-1753-454f-b225-485f9695f0a6-logs\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.742660 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.742779 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-config-data\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.742834 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp8w7\" (UniqueName: \"kubernetes.io/projected/c8866591-1753-454f-b225-485f9695f0a6-kube-api-access-rp8w7\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.844742 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.844799 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-config-data\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.844828 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp8w7\" (UniqueName: \"kubernetes.io/projected/c8866591-1753-454f-b225-485f9695f0a6-kube-api-access-rp8w7\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.844949 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8866591-1753-454f-b225-485f9695f0a6-logs\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.845335 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8866591-1753-454f-b225-485f9695f0a6-logs\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.850024 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc kubenswrapper[4923]: I0224 03:15:52.851224 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-config-data\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:52 crc 
kubenswrapper[4923]: I0224 03:15:52.863010 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp8w7\" (UniqueName: \"kubernetes.io/projected/c8866591-1753-454f-b225-485f9695f0a6-kube-api-access-rp8w7\") pod \"nova-api-0\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " pod="openstack/nova-api-0" Feb 24 03:15:53 crc kubenswrapper[4923]: I0224 03:15:53.009408 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 24 03:15:53 crc kubenswrapper[4923]: I0224 03:15:53.009718 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 24 03:15:53 crc kubenswrapper[4923]: I0224 03:15:53.045598 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:15:53 crc kubenswrapper[4923]: I0224 03:15:53.085958 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:15:53 crc kubenswrapper[4923]: I0224 03:15:53.259252 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21bbf39b-7001-4721-a0d4-95d2f1d523e9","Type":"ContainerStarted","Data":"39bfe6d87e7fe2135caaefc1fde9086729590d8a6a88398ecbdd71c63fe3ecad"} Feb 24 03:15:53 crc kubenswrapper[4923]: W0224 03:15:53.504225 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8866591_1753_454f_b225_485f9695f0a6.slice/crio-e269ca2aa33b755ddb313439f2a034afb13f4d9007ebc27845e103c7bf57887a WatchSource:0}: Error finding container e269ca2aa33b755ddb313439f2a034afb13f4d9007ebc27845e103c7bf57887a: Status 404 returned error can't find the container with id e269ca2aa33b755ddb313439f2a034afb13f4d9007ebc27845e103c7bf57887a Feb 24 03:15:53 crc kubenswrapper[4923]: I0224 03:15:53.506449 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:15:53 crc 
kubenswrapper[4923]: I0224 03:15:53.744246 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d1d2544-6a36-4e92-82d1-0cd1f937b362" path="/var/lib/kubelet/pods/0d1d2544-6a36-4e92-82d1-0cd1f937b362/volumes" Feb 24 03:15:53 crc kubenswrapper[4923]: I0224 03:15:53.746719 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a092fee-177a-48c6-b7bf-e28195f851f6" path="/var/lib/kubelet/pods/5a092fee-177a-48c6-b7bf-e28195f851f6/volumes" Feb 24 03:15:54 crc kubenswrapper[4923]: I0224 03:15:54.273424 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8866591-1753-454f-b225-485f9695f0a6","Type":"ContainerStarted","Data":"9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6"} Feb 24 03:15:54 crc kubenswrapper[4923]: I0224 03:15:54.273611 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8866591-1753-454f-b225-485f9695f0a6","Type":"ContainerStarted","Data":"19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753"} Feb 24 03:15:54 crc kubenswrapper[4923]: I0224 03:15:54.273626 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8866591-1753-454f-b225-485f9695f0a6","Type":"ContainerStarted","Data":"e269ca2aa33b755ddb313439f2a034afb13f4d9007ebc27845e103c7bf57887a"} Feb 24 03:15:54 crc kubenswrapper[4923]: I0224 03:15:54.275770 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21bbf39b-7001-4721-a0d4-95d2f1d523e9","Type":"ContainerStarted","Data":"afe1bc40b79046935e2a3157e103afafcb0a59d861e2fc1a06f04294a34ac01a"} Feb 24 03:15:54 crc kubenswrapper[4923]: I0224 03:15:54.300322 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.300283873 podStartE2EDuration="2.300283873s" podCreationTimestamp="2026-02-24 03:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:15:54.290724716 +0000 UTC m=+1278.307795519" watchObservedRunningTime="2026-02-24 03:15:54.300283873 +0000 UTC m=+1278.317354686" Feb 24 03:15:54 crc kubenswrapper[4923]: I0224 03:15:54.311686 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.311669656 podStartE2EDuration="2.311669656s" podCreationTimestamp="2026-02-24 03:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:15:54.306852562 +0000 UTC m=+1278.323923375" watchObservedRunningTime="2026-02-24 03:15:54.311669656 +0000 UTC m=+1278.328740469" Feb 24 03:15:57 crc kubenswrapper[4923]: I0224 03:15:57.625847 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 24 03:15:57 crc kubenswrapper[4923]: I0224 03:15:57.657492 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 24 03:15:58 crc kubenswrapper[4923]: I0224 03:15:58.008613 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 24 03:15:58 crc kubenswrapper[4923]: I0224 03:15:58.008927 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 24 03:15:59 crc kubenswrapper[4923]: I0224 03:15:59.026559 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 24 03:15:59 crc kubenswrapper[4923]: I0224 03:15:59.026568 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 24 03:16:02 crc kubenswrapper[4923]: I0224 03:16:02.658211 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 24 03:16:02 crc kubenswrapper[4923]: I0224 03:16:02.691543 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 24 03:16:03 crc kubenswrapper[4923]: I0224 03:16:03.046345 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 24 03:16:03 crc kubenswrapper[4923]: I0224 03:16:03.046697 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 24 03:16:03 crc kubenswrapper[4923]: I0224 03:16:03.405569 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 24 03:16:04 crc kubenswrapper[4923]: I0224 03:16:04.129613 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 24 03:16:04 crc kubenswrapper[4923]: I0224 03:16:04.129635 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 24 03:16:08 crc kubenswrapper[4923]: I0224 03:16:08.014686 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 24 03:16:08 crc 
kubenswrapper[4923]: I0224 03:16:08.019886 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 24 03:16:08 crc kubenswrapper[4923]: I0224 03:16:08.022850 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 24 03:16:08 crc kubenswrapper[4923]: I0224 03:16:08.422400 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.435421 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.439342 4923 generic.go:334] "Generic (PLEG): container finished" podID="d073d59f-dbca-42a7-b04e-9efcb0f2dbfe" containerID="bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6" exitCode=137 Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.439780 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.440234 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe","Type":"ContainerDied","Data":"bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6"} Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.440287 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe","Type":"ContainerDied","Data":"f40360b76f3e2c254da69b7cb4b686868237bd72483c0920ad9ae14a250be33f"} Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.440326 4923 scope.go:117] "RemoveContainer" containerID="bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.481749 4923 scope.go:117] "RemoveContainer" containerID="bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6" Feb 24 03:16:10 crc kubenswrapper[4923]: E0224 03:16:10.482246 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6\": container with ID starting with bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6 not found: ID does not exist" containerID="bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.482319 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6"} err="failed to get container status \"bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6\": rpc error: code = NotFound desc = could not find container \"bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6\": container with ID starting with 
bcb89977dc05fb533bced30b23bf6f40d0da77a4e98962bd0c596204d2830aa6 not found: ID does not exist" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.588719 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-combined-ca-bundle\") pod \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.588894 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-config-data\") pod \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.588957 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn849\" (UniqueName: \"kubernetes.io/projected/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-kube-api-access-jn849\") pod \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\" (UID: \"d073d59f-dbca-42a7-b04e-9efcb0f2dbfe\") " Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.599628 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-kube-api-access-jn849" (OuterVolumeSpecName: "kube-api-access-jn849") pod "d073d59f-dbca-42a7-b04e-9efcb0f2dbfe" (UID: "d073d59f-dbca-42a7-b04e-9efcb0f2dbfe"). InnerVolumeSpecName "kube-api-access-jn849". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.614827 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-config-data" (OuterVolumeSpecName: "config-data") pod "d073d59f-dbca-42a7-b04e-9efcb0f2dbfe" (UID: "d073d59f-dbca-42a7-b04e-9efcb0f2dbfe"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.617445 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d073d59f-dbca-42a7-b04e-9efcb0f2dbfe" (UID: "d073d59f-dbca-42a7-b04e-9efcb0f2dbfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.690839 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.691149 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.691162 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn849\" (UniqueName: \"kubernetes.io/projected/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe-kube-api-access-jn849\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.783158 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.794458 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.813156 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 24 03:16:10 crc kubenswrapper[4923]: E0224 03:16:10.813962 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d073d59f-dbca-42a7-b04e-9efcb0f2dbfe" 
containerName="nova-cell1-novncproxy-novncproxy" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.814072 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d073d59f-dbca-42a7-b04e-9efcb0f2dbfe" containerName="nova-cell1-novncproxy-novncproxy" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.814467 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d073d59f-dbca-42a7-b04e-9efcb0f2dbfe" containerName="nova-cell1-novncproxy-novncproxy" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.815356 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.818186 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.818466 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.819151 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.834171 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.995270 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4dn\" (UniqueName: \"kubernetes.io/projected/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-kube-api-access-jb4dn\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.995325 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.995353 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.995469 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:10 crc kubenswrapper[4923]: I0224 03:16:10.995519 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.097071 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4dn\" (UniqueName: \"kubernetes.io/projected/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-kube-api-access-jb4dn\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.097111 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.097136 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.097864 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.097913 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.101939 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.102187 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.103139 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.109833 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.118377 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4dn\" (UniqueName: \"kubernetes.io/projected/7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3-kube-api-access-jb4dn\") pod \"nova-cell1-novncproxy-0\" (UID: \"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3\") " pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.137362 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.615170 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 24 03:16:11 crc kubenswrapper[4923]: I0224 03:16:11.723335 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d073d59f-dbca-42a7-b04e-9efcb0f2dbfe" path="/var/lib/kubelet/pods/d073d59f-dbca-42a7-b04e-9efcb0f2dbfe/volumes" Feb 24 03:16:12 crc kubenswrapper[4923]: I0224 03:16:12.464143 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3","Type":"ContainerStarted","Data":"f8a37513bc747c40953f2cac632f33a6798d856f9bf5742c847c6c672452db4c"} Feb 24 03:16:12 crc kubenswrapper[4923]: I0224 03:16:12.464497 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3","Type":"ContainerStarted","Data":"deba28dcfa5747d48ed3dfdf650c2ab02081cad6d880c6cb54e25528ff9d973f"} Feb 24 03:16:12 crc kubenswrapper[4923]: I0224 03:16:12.516391 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5163653139999997 podStartE2EDuration="2.516365314s" podCreationTimestamp="2026-02-24 03:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:16:12.499488409 +0000 UTC m=+1296.516559232" watchObservedRunningTime="2026-02-24 03:16:12.516365314 +0000 UTC m=+1296.533436157" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.053713 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.055714 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 24 
03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.059438 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.060964 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.475920 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.479529 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.664586 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-g57rb"] Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.671065 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.697104 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-g57rb"] Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.856223 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4trb\" (UniqueName: \"kubernetes.io/projected/d73557bd-a940-406b-b9bb-9f43f09b5f77-kube-api-access-g4trb\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.856304 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 
03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.856339 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-config\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.856580 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.856705 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.856771 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.958877 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-config\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: 
I0224 03:16:13.958966 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.959009 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.959038 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.959102 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4trb\" (UniqueName: \"kubernetes.io/projected/d73557bd-a940-406b-b9bb-9f43f09b5f77-kube-api-access-g4trb\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.959135 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.960062 4923 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.960162 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.960246 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.960324 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-config\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.960423 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:13 crc kubenswrapper[4923]: I0224 03:16:13.983793 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4trb\" (UniqueName: 
\"kubernetes.io/projected/d73557bd-a940-406b-b9bb-9f43f09b5f77-kube-api-access-g4trb\") pod \"dnsmasq-dns-cd5cbd7b9-g57rb\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:14 crc kubenswrapper[4923]: I0224 03:16:14.012053 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:14 crc kubenswrapper[4923]: W0224 03:16:14.732353 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd73557bd_a940_406b_b9bb_9f43f09b5f77.slice/crio-a83cc4666013b127452a26ddbdfb69a798af4e10808c99f9d042f360066015d1 WatchSource:0}: Error finding container a83cc4666013b127452a26ddbdfb69a798af4e10808c99f9d042f360066015d1: Status 404 returned error can't find the container with id a83cc4666013b127452a26ddbdfb69a798af4e10808c99f9d042f360066015d1 Feb 24 03:16:14 crc kubenswrapper[4923]: I0224 03:16:14.739592 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-g57rb"] Feb 24 03:16:15 crc kubenswrapper[4923]: I0224 03:16:15.501422 4923 generic.go:334] "Generic (PLEG): container finished" podID="d73557bd-a940-406b-b9bb-9f43f09b5f77" containerID="b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36" exitCode=0 Feb 24 03:16:15 crc kubenswrapper[4923]: I0224 03:16:15.501601 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" event={"ID":"d73557bd-a940-406b-b9bb-9f43f09b5f77","Type":"ContainerDied","Data":"b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36"} Feb 24 03:16:15 crc kubenswrapper[4923]: I0224 03:16:15.501856 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" event={"ID":"d73557bd-a940-406b-b9bb-9f43f09b5f77","Type":"ContainerStarted","Data":"a83cc4666013b127452a26ddbdfb69a798af4e10808c99f9d042f360066015d1"} Feb 24 
03:16:15 crc kubenswrapper[4923]: I0224 03:16:15.836870 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:16:15 crc kubenswrapper[4923]: I0224 03:16:15.838506 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="ceilometer-central-agent" containerID="cri-o://33c265621497e15b63b49367e1e4f35deab97b0b1060f46f24f73796bda2e128" gracePeriod=30 Feb 24 03:16:15 crc kubenswrapper[4923]: I0224 03:16:15.838567 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="proxy-httpd" containerID="cri-o://91d81246b73c0225957b1bd2d65d5fe5228e087a38e9f21a5abb232973036ef8" gracePeriod=30 Feb 24 03:16:15 crc kubenswrapper[4923]: I0224 03:16:15.838579 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="sg-core" containerID="cri-o://07c7c699b11ffa84fc823982a34336df0a9616748438f269f4aa71442da27352" gracePeriod=30 Feb 24 03:16:15 crc kubenswrapper[4923]: I0224 03:16:15.838606 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="ceilometer-notification-agent" containerID="cri-o://1186ac07f85c636118a322753084290e3301c3e363a6f4b7ef454e061037f398" gracePeriod=30 Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.137685 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.310219 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.512135 4923 generic.go:334] "Generic (PLEG): container finished" 
podID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerID="91d81246b73c0225957b1bd2d65d5fe5228e087a38e9f21a5abb232973036ef8" exitCode=0 Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.512165 4923 generic.go:334] "Generic (PLEG): container finished" podID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerID="07c7c699b11ffa84fc823982a34336df0a9616748438f269f4aa71442da27352" exitCode=2 Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.512174 4923 generic.go:334] "Generic (PLEG): container finished" podID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerID="1186ac07f85c636118a322753084290e3301c3e363a6f4b7ef454e061037f398" exitCode=0 Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.512182 4923 generic.go:334] "Generic (PLEG): container finished" podID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerID="33c265621497e15b63b49367e1e4f35deab97b0b1060f46f24f73796bda2e128" exitCode=0 Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.512224 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerDied","Data":"91d81246b73c0225957b1bd2d65d5fe5228e087a38e9f21a5abb232973036ef8"} Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.512254 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerDied","Data":"07c7c699b11ffa84fc823982a34336df0a9616748438f269f4aa71442da27352"} Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.512266 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerDied","Data":"1186ac07f85c636118a322753084290e3301c3e363a6f4b7ef454e061037f398"} Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.512277 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerDied","Data":"33c265621497e15b63b49367e1e4f35deab97b0b1060f46f24f73796bda2e128"} Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.546840 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-log" containerID="cri-o://19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753" gracePeriod=30 Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.547342 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" event={"ID":"d73557bd-a940-406b-b9bb-9f43f09b5f77","Type":"ContainerStarted","Data":"a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe"} Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.548719 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-api" containerID="cri-o://9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6" gracePeriod=30 Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.548979 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.582762 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" podStartSLOduration=3.582745981 podStartE2EDuration="3.582745981s" podCreationTimestamp="2026-02-24 03:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:16:16.57999569 +0000 UTC m=+1300.597066513" watchObservedRunningTime="2026-02-24 03:16:16.582745981 +0000 UTC m=+1300.599816794" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.702331 4923 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.813492 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-run-httpd\") pod \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.813545 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-scripts\") pod \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.813616 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nspm\" (UniqueName: \"kubernetes.io/projected/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-kube-api-access-4nspm\") pod \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.813672 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-config-data\") pod \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.813775 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-sg-core-conf-yaml\") pod \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.813791 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-log-httpd\") pod \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.813840 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-ceilometer-tls-certs\") pod \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.813875 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-combined-ca-bundle\") pod \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\" (UID: \"627ea57c-55e7-43fc-ab33-51ab2e7a7e80\") " Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.814844 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "627ea57c-55e7-43fc-ab33-51ab2e7a7e80" (UID: "627ea57c-55e7-43fc-ab33-51ab2e7a7e80"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.816769 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "627ea57c-55e7-43fc-ab33-51ab2e7a7e80" (UID: "627ea57c-55e7-43fc-ab33-51ab2e7a7e80"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.821634 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-kube-api-access-4nspm" (OuterVolumeSpecName: "kube-api-access-4nspm") pod "627ea57c-55e7-43fc-ab33-51ab2e7a7e80" (UID: "627ea57c-55e7-43fc-ab33-51ab2e7a7e80"). InnerVolumeSpecName "kube-api-access-4nspm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.822383 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-scripts" (OuterVolumeSpecName: "scripts") pod "627ea57c-55e7-43fc-ab33-51ab2e7a7e80" (UID: "627ea57c-55e7-43fc-ab33-51ab2e7a7e80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.846646 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "627ea57c-55e7-43fc-ab33-51ab2e7a7e80" (UID: "627ea57c-55e7-43fc-ab33-51ab2e7a7e80"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.868042 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "627ea57c-55e7-43fc-ab33-51ab2e7a7e80" (UID: "627ea57c-55e7-43fc-ab33-51ab2e7a7e80"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.910451 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "627ea57c-55e7-43fc-ab33-51ab2e7a7e80" (UID: "627ea57c-55e7-43fc-ab33-51ab2e7a7e80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.915545 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nspm\" (UniqueName: \"kubernetes.io/projected/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-kube-api-access-4nspm\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.915574 4923 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.915583 4923 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.915592 4923 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.915600 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.915609 4923 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.915617 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-scripts\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:16 crc kubenswrapper[4923]: I0224 03:16:16.937461 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-config-data" (OuterVolumeSpecName: "config-data") pod "627ea57c-55e7-43fc-ab33-51ab2e7a7e80" (UID: "627ea57c-55e7-43fc-ab33-51ab2e7a7e80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.017321 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627ea57c-55e7-43fc-ab33-51ab2e7a7e80-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.557473 4923 generic.go:334] "Generic (PLEG): container finished" podID="c8866591-1753-454f-b225-485f9695f0a6" containerID="19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753" exitCode=143 Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.557671 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8866591-1753-454f-b225-485f9695f0a6","Type":"ContainerDied","Data":"19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753"} Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.561420 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.561444 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"627ea57c-55e7-43fc-ab33-51ab2e7a7e80","Type":"ContainerDied","Data":"01c44f6622c885c0d0829cb871c133986f8ee3b7d834a544b6500ac29868c7eb"} Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.561532 4923 scope.go:117] "RemoveContainer" containerID="91d81246b73c0225957b1bd2d65d5fe5228e087a38e9f21a5abb232973036ef8" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.601667 4923 scope.go:117] "RemoveContainer" containerID="07c7c699b11ffa84fc823982a34336df0a9616748438f269f4aa71442da27352" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.602075 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.615818 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.625652 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:16:17 crc kubenswrapper[4923]: E0224 03:16:17.627054 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="proxy-httpd" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.627071 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="proxy-httpd" Feb 24 03:16:17 crc kubenswrapper[4923]: E0224 03:16:17.627089 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="sg-core" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.627095 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="sg-core" Feb 24 03:16:17 crc kubenswrapper[4923]: E0224 03:16:17.627108 4923 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="ceilometer-notification-agent" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.627116 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="ceilometer-notification-agent" Feb 24 03:16:17 crc kubenswrapper[4923]: E0224 03:16:17.627131 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="ceilometer-central-agent" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.627140 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="ceilometer-central-agent" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.627364 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="sg-core" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.627375 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="ceilometer-central-agent" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.627388 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="proxy-httpd" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.627398 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" containerName="ceilometer-notification-agent" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.629066 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.634764 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.634764 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.634882 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.654150 4923 scope.go:117] "RemoveContainer" containerID="1186ac07f85c636118a322753084290e3301c3e363a6f4b7ef454e061037f398" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.657708 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.686864 4923 scope.go:117] "RemoveContainer" containerID="33c265621497e15b63b49367e1e4f35deab97b0b1060f46f24f73796bda2e128" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.724442 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627ea57c-55e7-43fc-ab33-51ab2e7a7e80" path="/var/lib/kubelet/pods/627ea57c-55e7-43fc-ab33-51ab2e7a7e80/volumes" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.728543 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.728571 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.728595 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-run-httpd\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.728718 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-scripts\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.728849 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.728983 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-config-data\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.729039 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-log-httpd\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.729082 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzhc5\" (UniqueName: \"kubernetes.io/projected/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-kube-api-access-bzhc5\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.831871 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzhc5\" (UniqueName: \"kubernetes.io/projected/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-kube-api-access-bzhc5\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.835478 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.835529 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.835586 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-run-httpd\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.835661 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-scripts\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.836180 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-run-httpd\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.836678 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.836849 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-config-data\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.836897 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-log-httpd\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.837352 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-log-httpd\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.840176 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.840485 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.840489 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-scripts\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.843216 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.851675 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-config-data\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") " pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.852611 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzhc5\" (UniqueName: \"kubernetes.io/projected/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-kube-api-access-bzhc5\") pod \"ceilometer-0\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") 
" pod="openstack/ceilometer-0" Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.884491 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:16:17 crc kubenswrapper[4923]: I0224 03:16:17.885224 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 24 03:16:18 crc kubenswrapper[4923]: I0224 03:16:18.399748 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 24 03:16:18 crc kubenswrapper[4923]: W0224 03:16:18.401270 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51492e23_21d6_46a9_a8e6_2e7cf4063c0a.slice/crio-cdd405c4d69769045ba7c5c247706d9be527c218c0c970c84450c10c0500e267 WatchSource:0}: Error finding container cdd405c4d69769045ba7c5c247706d9be527c218c0c970c84450c10c0500e267: Status 404 returned error can't find the container with id cdd405c4d69769045ba7c5c247706d9be527c218c0c970c84450c10c0500e267 Feb 24 03:16:18 crc kubenswrapper[4923]: I0224 03:16:18.571803 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerStarted","Data":"cdd405c4d69769045ba7c5c247706d9be527c218c0c970c84450c10c0500e267"} Feb 24 03:16:19 crc kubenswrapper[4923]: I0224 03:16:19.592263 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerStarted","Data":"313a8eff2b254e4ecb5a45bf7b7604b2961a7f818e1cbbb39ffacdd7870f91e6"} Feb 24 03:16:19 crc kubenswrapper[4923]: I0224 03:16:19.917670 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 
03:16:19 crc kubenswrapper[4923]: I0224 03:16:19.917972 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.123655 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.177168 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-combined-ca-bundle\") pod \"c8866591-1753-454f-b225-485f9695f0a6\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.177245 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8866591-1753-454f-b225-485f9695f0a6-logs\") pod \"c8866591-1753-454f-b225-485f9695f0a6\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.177517 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp8w7\" (UniqueName: \"kubernetes.io/projected/c8866591-1753-454f-b225-485f9695f0a6-kube-api-access-rp8w7\") pod \"c8866591-1753-454f-b225-485f9695f0a6\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.177611 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-config-data\") pod \"c8866591-1753-454f-b225-485f9695f0a6\" (UID: \"c8866591-1753-454f-b225-485f9695f0a6\") " Feb 24 03:16:20 crc 
kubenswrapper[4923]: I0224 03:16:20.178490 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8866591-1753-454f-b225-485f9695f0a6-logs" (OuterVolumeSpecName: "logs") pod "c8866591-1753-454f-b225-485f9695f0a6" (UID: "c8866591-1753-454f-b225-485f9695f0a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.183601 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8866591-1753-454f-b225-485f9695f0a6-kube-api-access-rp8w7" (OuterVolumeSpecName: "kube-api-access-rp8w7") pod "c8866591-1753-454f-b225-485f9695f0a6" (UID: "c8866591-1753-454f-b225-485f9695f0a6"). InnerVolumeSpecName "kube-api-access-rp8w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.216186 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8866591-1753-454f-b225-485f9695f0a6" (UID: "c8866591-1753-454f-b225-485f9695f0a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.221833 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-config-data" (OuterVolumeSpecName: "config-data") pod "c8866591-1753-454f-b225-485f9695f0a6" (UID: "c8866591-1753-454f-b225-485f9695f0a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.279968 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.279992 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8866591-1753-454f-b225-485f9695f0a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.280001 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8866591-1753-454f-b225-485f9695f0a6-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.280010 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp8w7\" (UniqueName: \"kubernetes.io/projected/c8866591-1753-454f-b225-485f9695f0a6-kube-api-access-rp8w7\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.613008 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerStarted","Data":"d3a50774d794867957f0d42d90af148d477d95804ece44adcc1f09e19dc86bdd"} Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.613061 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerStarted","Data":"fdbea0ca2a8c7ed41bd9ce9c25f1c6193b5df559162cdc83e9205a51a22b0e81"} Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.615058 4923 generic.go:334] "Generic (PLEG): container finished" podID="c8866591-1753-454f-b225-485f9695f0a6" containerID="9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6" exitCode=0 Feb 24 03:16:20 crc 
kubenswrapper[4923]: I0224 03:16:20.615094 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8866591-1753-454f-b225-485f9695f0a6","Type":"ContainerDied","Data":"9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6"} Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.615115 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8866591-1753-454f-b225-485f9695f0a6","Type":"ContainerDied","Data":"e269ca2aa33b755ddb313439f2a034afb13f4d9007ebc27845e103c7bf57887a"} Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.615124 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.615132 4923 scope.go:117] "RemoveContainer" containerID="9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.660089 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.680811 4923 scope.go:117] "RemoveContainer" containerID="19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.681425 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.702593 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:20 crc kubenswrapper[4923]: E0224 03:16:20.703069 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-log" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.703091 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-log" Feb 24 03:16:20 crc kubenswrapper[4923]: E0224 03:16:20.703125 4923 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-api" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.703133 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-api" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.703378 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-log" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.703411 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8866591-1753-454f-b225-485f9695f0a6" containerName="nova-api-api" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.704590 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.707460 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.707762 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.708087 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.713976 4923 scope.go:117] "RemoveContainer" containerID="9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6" Feb 24 03:16:20 crc kubenswrapper[4923]: E0224 03:16:20.715604 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6\": container with ID starting with 9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6 not found: ID does not exist" 
containerID="9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.715724 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6"} err="failed to get container status \"9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6\": rpc error: code = NotFound desc = could not find container \"9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6\": container with ID starting with 9403133054e2da93cb98036c315f63591ebe925fe466b3ea3e1e17c105509da6 not found: ID does not exist" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.715830 4923 scope.go:117] "RemoveContainer" containerID="19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753" Feb 24 03:16:20 crc kubenswrapper[4923]: E0224 03:16:20.731324 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753\": container with ID starting with 19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753 not found: ID does not exist" containerID="19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.731373 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753"} err="failed to get container status \"19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753\": rpc error: code = NotFound desc = could not find container \"19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753\": container with ID starting with 19a51a503fc4dd3138f85e59dbeff5754a7f0f1b4b122bbbcee3b761f1598753 not found: ID does not exist" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.743308 4923 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.788574 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-config-data\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.788630 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-public-tls-certs\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.788686 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.788765 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.788820 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4qnb\" (UniqueName: \"kubernetes.io/projected/4d90909e-92bd-4796-98c0-21f2f1b51e04-kube-api-access-b4qnb\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.788847 
4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d90909e-92bd-4796-98c0-21f2f1b51e04-logs\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.890746 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.890836 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.890874 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4qnb\" (UniqueName: \"kubernetes.io/projected/4d90909e-92bd-4796-98c0-21f2f1b51e04-kube-api-access-b4qnb\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.890898 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d90909e-92bd-4796-98c0-21f2f1b51e04-logs\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.890926 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-config-data\") pod \"nova-api-0\" (UID: 
\"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.890953 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-public-tls-certs\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.892513 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d90909e-92bd-4796-98c0-21f2f1b51e04-logs\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.896563 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-config-data\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.903820 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.905012 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.905504 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-public-tls-certs\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:20 crc kubenswrapper[4923]: I0224 03:16:20.925917 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4qnb\" (UniqueName: \"kubernetes.io/projected/4d90909e-92bd-4796-98c0-21f2f1b51e04-kube-api-access-b4qnb\") pod \"nova-api-0\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") " pod="openstack/nova-api-0" Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.039843 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.138366 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.157031 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:21 crc kubenswrapper[4923]: W0224 03:16:21.559232 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d90909e_92bd_4796_98c0_21f2f1b51e04.slice/crio-715f31da8718e7c0c85b093adbc2a91e0fe53cb60cb23b4eaee77f50a35b2dc9 WatchSource:0}: Error finding container 715f31da8718e7c0c85b093adbc2a91e0fe53cb60cb23b4eaee77f50a35b2dc9: Status 404 returned error can't find the container with id 715f31da8718e7c0c85b093adbc2a91e0fe53cb60cb23b4eaee77f50a35b2dc9 Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.559860 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.631670 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"4d90909e-92bd-4796-98c0-21f2f1b51e04","Type":"ContainerStarted","Data":"715f31da8718e7c0c85b093adbc2a91e0fe53cb60cb23b4eaee77f50a35b2dc9"} Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.667093 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.739918 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8866591-1753-454f-b225-485f9695f0a6" path="/var/lib/kubelet/pods/c8866591-1753-454f-b225-485f9695f0a6/volumes" Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.897433 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mvsg9"] Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.898699 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.904222 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.904328 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 24 03:16:21 crc kubenswrapper[4923]: I0224 03:16:21.906729 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mvsg9"] Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.031267 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.031367 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-scripts\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.031398 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-config-data\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.031477 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24vc\" (UniqueName: \"kubernetes.io/projected/9124ee53-9cb0-4817-b967-d22e84935a4d-kube-api-access-g24vc\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.133020 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.133398 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-scripts\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.133522 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-config-data\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.133680 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g24vc\" (UniqueName: \"kubernetes.io/projected/9124ee53-9cb0-4817-b967-d22e84935a4d-kube-api-access-g24vc\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.141216 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-scripts\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.141478 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-config-data\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.141696 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.154952 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24vc\" (UniqueName: 
\"kubernetes.io/projected/9124ee53-9cb0-4817-b967-d22e84935a4d-kube-api-access-g24vc\") pod \"nova-cell1-cell-mapping-mvsg9\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") " pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.283714 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mvsg9" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.652450 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d90909e-92bd-4796-98c0-21f2f1b51e04","Type":"ContainerStarted","Data":"d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c"} Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.652819 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d90909e-92bd-4796-98c0-21f2f1b51e04","Type":"ContainerStarted","Data":"a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1"} Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.686485 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.686440169 podStartE2EDuration="2.686440169s" podCreationTimestamp="2026-02-24 03:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:16:22.670092678 +0000 UTC m=+1306.687163491" watchObservedRunningTime="2026-02-24 03:16:22.686440169 +0000 UTC m=+1306.703510992" Feb 24 03:16:22 crc kubenswrapper[4923]: I0224 03:16:22.754456 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mvsg9"] Feb 24 03:16:23 crc kubenswrapper[4923]: I0224 03:16:23.664870 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mvsg9" 
event={"ID":"9124ee53-9cb0-4817-b967-d22e84935a4d","Type":"ContainerStarted","Data":"710b2ca33276dd57b06f5554cab60da196cbafa53c2fcab67057a10b0d5b6ec4"} Feb 24 03:16:23 crc kubenswrapper[4923]: I0224 03:16:23.665208 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mvsg9" event={"ID":"9124ee53-9cb0-4817-b967-d22e84935a4d","Type":"ContainerStarted","Data":"5bf8b901c7b76c831a4e1ab15f57db4de197456c83fab773ea2e5b15b39fe3b0"} Feb 24 03:16:23 crc kubenswrapper[4923]: I0224 03:16:23.673961 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="ceilometer-central-agent" containerID="cri-o://313a8eff2b254e4ecb5a45bf7b7604b2961a7f818e1cbbb39ffacdd7870f91e6" gracePeriod=30 Feb 24 03:16:23 crc kubenswrapper[4923]: I0224 03:16:23.674028 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="proxy-httpd" containerID="cri-o://fc6b180f56cf78117636c016bf50c7af57ca6d5499535030d2b49487a60ab570" gracePeriod=30 Feb 24 03:16:23 crc kubenswrapper[4923]: I0224 03:16:23.674042 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="sg-core" containerID="cri-o://d3a50774d794867957f0d42d90af148d477d95804ece44adcc1f09e19dc86bdd" gracePeriod=30 Feb 24 03:16:23 crc kubenswrapper[4923]: I0224 03:16:23.673973 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerStarted","Data":"fc6b180f56cf78117636c016bf50c7af57ca6d5499535030d2b49487a60ab570"} Feb 24 03:16:23 crc kubenswrapper[4923]: I0224 03:16:23.674136 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 24 03:16:23 crc 
kubenswrapper[4923]: I0224 03:16:23.674288 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="ceilometer-notification-agent" containerID="cri-o://fdbea0ca2a8c7ed41bd9ce9c25f1c6193b5df559162cdc83e9205a51a22b0e81" gracePeriod=30 Feb 24 03:16:23 crc kubenswrapper[4923]: I0224 03:16:23.687318 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mvsg9" podStartSLOduration=2.687279779 podStartE2EDuration="2.687279779s" podCreationTimestamp="2026-02-24 03:16:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:16:23.67802031 +0000 UTC m=+1307.695091133" watchObservedRunningTime="2026-02-24 03:16:23.687279779 +0000 UTC m=+1307.704350592" Feb 24 03:16:23 crc kubenswrapper[4923]: I0224 03:16:23.715948 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.432858938 podStartE2EDuration="6.715931597s" podCreationTimestamp="2026-02-24 03:16:17 +0000 UTC" firstStartedPulling="2026-02-24 03:16:18.404357065 +0000 UTC m=+1302.421427878" lastFinishedPulling="2026-02-24 03:16:22.687429724 +0000 UTC m=+1306.704500537" observedRunningTime="2026-02-24 03:16:23.70710805 +0000 UTC m=+1307.724178863" watchObservedRunningTime="2026-02-24 03:16:23.715931597 +0000 UTC m=+1307.733002410" Feb 24 03:16:24 crc kubenswrapper[4923]: I0224 03:16:24.014692 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" Feb 24 03:16:24 crc kubenswrapper[4923]: I0224 03:16:24.094116 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-9jnnr"] Feb 24 03:16:24 crc kubenswrapper[4923]: I0224 03:16:24.094649 4923 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" podUID="e95191de-44e3-4252-9fcf-cf159a81d3c8" containerName="dnsmasq-dns" containerID="cri-o://5b28b64c4e316f96be21ee2a5821927ed644ecea46ab8f6ce7b0b5fe3c23bb2b" gracePeriod=10 Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.385891 4923 generic.go:334] "Generic (PLEG): container finished" podID="e95191de-44e3-4252-9fcf-cf159a81d3c8" containerID="5b28b64c4e316f96be21ee2a5821927ed644ecea46ab8f6ce7b0b5fe3c23bb2b" exitCode=0 Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.385940 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" event={"ID":"e95191de-44e3-4252-9fcf-cf159a81d3c8","Type":"ContainerDied","Data":"5b28b64c4e316f96be21ee2a5821927ed644ecea46ab8f6ce7b0b5fe3c23bb2b"} Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.390967 4923 generic.go:334] "Generic (PLEG): container finished" podID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerID="fc6b180f56cf78117636c016bf50c7af57ca6d5499535030d2b49487a60ab570" exitCode=0 Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.391009 4923 generic.go:334] "Generic (PLEG): container finished" podID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerID="d3a50774d794867957f0d42d90af148d477d95804ece44adcc1f09e19dc86bdd" exitCode=2 Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.391021 4923 generic.go:334] "Generic (PLEG): container finished" podID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerID="fdbea0ca2a8c7ed41bd9ce9c25f1c6193b5df559162cdc83e9205a51a22b0e81" exitCode=0 Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.391030 4923 generic.go:334] "Generic (PLEG): container finished" podID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerID="313a8eff2b254e4ecb5a45bf7b7604b2961a7f818e1cbbb39ffacdd7870f91e6" exitCode=0 Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.391615 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerDied","Data":"fc6b180f56cf78117636c016bf50c7af57ca6d5499535030d2b49487a60ab570"} Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.391775 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerDied","Data":"d3a50774d794867957f0d42d90af148d477d95804ece44adcc1f09e19dc86bdd"} Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.391909 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerDied","Data":"fdbea0ca2a8c7ed41bd9ce9c25f1c6193b5df559162cdc83e9205a51a22b0e81"} Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.392047 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerDied","Data":"313a8eff2b254e4ecb5a45bf7b7604b2961a7f818e1cbbb39ffacdd7870f91e6"} Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.556628 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.601485 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-config\") pod \"e95191de-44e3-4252-9fcf-cf159a81d3c8\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.601729 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-sb\") pod \"e95191de-44e3-4252-9fcf-cf159a81d3c8\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.601874 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-nb\") pod \"e95191de-44e3-4252-9fcf-cf159a81d3c8\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.601951 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-swift-storage-0\") pod \"e95191de-44e3-4252-9fcf-cf159a81d3c8\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.602072 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-svc\") pod \"e95191de-44e3-4252-9fcf-cf159a81d3c8\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.602191 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5npc\" 
(UniqueName: \"kubernetes.io/projected/e95191de-44e3-4252-9fcf-cf159a81d3c8-kube-api-access-l5npc\") pod \"e95191de-44e3-4252-9fcf-cf159a81d3c8\" (UID: \"e95191de-44e3-4252-9fcf-cf159a81d3c8\") " Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.607728 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95191de-44e3-4252-9fcf-cf159a81d3c8-kube-api-access-l5npc" (OuterVolumeSpecName: "kube-api-access-l5npc") pod "e95191de-44e3-4252-9fcf-cf159a81d3c8" (UID: "e95191de-44e3-4252-9fcf-cf159a81d3c8"). InnerVolumeSpecName "kube-api-access-l5npc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.672373 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e95191de-44e3-4252-9fcf-cf159a81d3c8" (UID: "e95191de-44e3-4252-9fcf-cf159a81d3c8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.679505 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e95191de-44e3-4252-9fcf-cf159a81d3c8" (UID: "e95191de-44e3-4252-9fcf-cf159a81d3c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.680762 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e95191de-44e3-4252-9fcf-cf159a81d3c8" (UID: "e95191de-44e3-4252-9fcf-cf159a81d3c8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.681536 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e95191de-44e3-4252-9fcf-cf159a81d3c8" (UID: "e95191de-44e3-4252-9fcf-cf159a81d3c8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.698637 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-config" (OuterVolumeSpecName: "config") pod "e95191de-44e3-4252-9fcf-cf159a81d3c8" (UID: "e95191de-44e3-4252-9fcf-cf159a81d3c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.704330 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.704355 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.704367 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.704376 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:25 crc 
kubenswrapper[4923]: I0224 03:16:25.704385 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e95191de-44e3-4252-9fcf-cf159a81d3c8-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:25 crc kubenswrapper[4923]: I0224 03:16:25.704395 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5npc\" (UniqueName: \"kubernetes.io/projected/e95191de-44e3-4252-9fcf-cf159a81d3c8-kube-api-access-l5npc\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.251869 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.313233 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-ceilometer-tls-certs\") pod \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") "
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.313355 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzhc5\" (UniqueName: \"kubernetes.io/projected/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-kube-api-access-bzhc5\") pod \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") "
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.313480 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-scripts\") pod \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") "
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.313551 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-combined-ca-bundle\") pod \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") "
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.313628 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-config-data\") pod \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") "
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.313760 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-sg-core-conf-yaml\") pod \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") "
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.313845 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-log-httpd\") pod \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") "
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.313918 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-run-httpd\") pod \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\" (UID: \"51492e23-21d6-46a9-a8e6-2e7cf4063c0a\") "
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.315443 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51492e23-21d6-46a9-a8e6-2e7cf4063c0a" (UID: "51492e23-21d6-46a9-a8e6-2e7cf4063c0a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.319491 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51492e23-21d6-46a9-a8e6-2e7cf4063c0a" (UID: "51492e23-21d6-46a9-a8e6-2e7cf4063c0a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.319972 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-kube-api-access-bzhc5" (OuterVolumeSpecName: "kube-api-access-bzhc5") pod "51492e23-21d6-46a9-a8e6-2e7cf4063c0a" (UID: "51492e23-21d6-46a9-a8e6-2e7cf4063c0a"). InnerVolumeSpecName "kube-api-access-bzhc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.322541 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-scripts" (OuterVolumeSpecName: "scripts") pod "51492e23-21d6-46a9-a8e6-2e7cf4063c0a" (UID: "51492e23-21d6-46a9-a8e6-2e7cf4063c0a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.375527 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51492e23-21d6-46a9-a8e6-2e7cf4063c0a" (UID: "51492e23-21d6-46a9-a8e6-2e7cf4063c0a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.380113 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "51492e23-21d6-46a9-a8e6-2e7cf4063c0a" (UID: "51492e23-21d6-46a9-a8e6-2e7cf4063c0a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.406474 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51492e23-21d6-46a9-a8e6-2e7cf4063c0a","Type":"ContainerDied","Data":"cdd405c4d69769045ba7c5c247706d9be527c218c0c970c84450c10c0500e267"}
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.406532 4923 scope.go:117] "RemoveContainer" containerID="fc6b180f56cf78117636c016bf50c7af57ca6d5499535030d2b49487a60ab570"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.406656 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.412053 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51492e23-21d6-46a9-a8e6-2e7cf4063c0a" (UID: "51492e23-21d6-46a9-a8e6-2e7cf4063c0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.412839 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr" event={"ID":"e95191de-44e3-4252-9fcf-cf159a81d3c8","Type":"ContainerDied","Data":"5885fd7b7042c31cd6e77bb54453d5cec1077a75799e2c768c069b4e3bdfff55"}
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.412974 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-9jnnr"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.416605 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.416633 4923 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.416671 4923 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.416726 4923 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.416741 4923 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.416758 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzhc5\" (UniqueName: \"kubernetes.io/projected/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-kube-api-access-bzhc5\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.416774 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.446434 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-config-data" (OuterVolumeSpecName: "config-data") pod "51492e23-21d6-46a9-a8e6-2e7cf4063c0a" (UID: "51492e23-21d6-46a9-a8e6-2e7cf4063c0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.522693 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51492e23-21d6-46a9-a8e6-2e7cf4063c0a-config-data\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.539854 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-9jnnr"]
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.540684 4923 scope.go:117] "RemoveContainer" containerID="d3a50774d794867957f0d42d90af148d477d95804ece44adcc1f09e19dc86bdd"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.547129 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-9jnnr"]
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.563582 4923 scope.go:117] "RemoveContainer" containerID="fdbea0ca2a8c7ed41bd9ce9c25f1c6193b5df559162cdc83e9205a51a22b0e81"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.588796 4923 scope.go:117] "RemoveContainer" containerID="313a8eff2b254e4ecb5a45bf7b7604b2961a7f818e1cbbb39ffacdd7870f91e6"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.623377 4923 scope.go:117] "RemoveContainer" containerID="5b28b64c4e316f96be21ee2a5821927ed644ecea46ab8f6ce7b0b5fe3c23bb2b"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.649266 4923 scope.go:117] "RemoveContainer" containerID="9dc420c569360ec65f0cddca7a11509c70744e03c03377e7607febc12afecf44"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.744364 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.756751 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766010 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 24 03:16:26 crc kubenswrapper[4923]: E0224 03:16:26.766432 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="ceilometer-notification-agent"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766455 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="ceilometer-notification-agent"
Feb 24 03:16:26 crc kubenswrapper[4923]: E0224 03:16:26.766468 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="sg-core"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766477 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="sg-core"
Feb 24 03:16:26 crc kubenswrapper[4923]: E0224 03:16:26.766500 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="proxy-httpd"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766509 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="proxy-httpd"
Feb 24 03:16:26 crc kubenswrapper[4923]: E0224 03:16:26.766532 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95191de-44e3-4252-9fcf-cf159a81d3c8" containerName="init"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766540 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95191de-44e3-4252-9fcf-cf159a81d3c8" containerName="init"
Feb 24 03:16:26 crc kubenswrapper[4923]: E0224 03:16:26.766559 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="ceilometer-central-agent"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766566 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="ceilometer-central-agent"
Feb 24 03:16:26 crc kubenswrapper[4923]: E0224 03:16:26.766580 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95191de-44e3-4252-9fcf-cf159a81d3c8" containerName="dnsmasq-dns"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766587 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95191de-44e3-4252-9fcf-cf159a81d3c8" containerName="dnsmasq-dns"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766798 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="ceilometer-notification-agent"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766816 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="sg-core"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766834 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="ceilometer-central-agent"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766866 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" containerName="proxy-httpd"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.766886 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95191de-44e3-4252-9fcf-cf159a81d3c8" containerName="dnsmasq-dns"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.769604 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.774631 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.775120 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.775221 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.775272 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.833849 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-config-data\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.833947 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.834345 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.834631 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-scripts\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.834727 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-run-httpd\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.834830 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.834867 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dw2c\" (UniqueName: \"kubernetes.io/projected/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-kube-api-access-8dw2c\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.834918 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-log-httpd\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.938031 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-scripts\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.938082 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-run-httpd\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.938639 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-run-httpd\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.938115 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.938715 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dw2c\" (UniqueName: \"kubernetes.io/projected/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-kube-api-access-8dw2c\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.939291 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-log-httpd\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.939404 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-config-data\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.939439 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.939540 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.939775 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-log-httpd\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.945347 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-scripts\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.947698 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.955356 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.956130 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-config-data\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.956729 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:26 crc kubenswrapper[4923]: I0224 03:16:26.964737 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dw2c\" (UniqueName: \"kubernetes.io/projected/0af0866b-f6b6-45cb-9322-25fc22f6b6b4-kube-api-access-8dw2c\") pod \"ceilometer-0\" (UID: \"0af0866b-f6b6-45cb-9322-25fc22f6b6b4\") " pod="openstack/ceilometer-0"
Feb 24 03:16:27 crc kubenswrapper[4923]: I0224 03:16:27.084935 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 24 03:16:27 crc kubenswrapper[4923]: I0224 03:16:27.568319 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 24 03:16:27 crc kubenswrapper[4923]: W0224 03:16:27.569458 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0af0866b_f6b6_45cb_9322_25fc22f6b6b4.slice/crio-c65d524c3f3e2b7cc564f2133786f2e74b3762a6a4b0c2169e5cc91eab260b55 WatchSource:0}: Error finding container c65d524c3f3e2b7cc564f2133786f2e74b3762a6a4b0c2169e5cc91eab260b55: Status 404 returned error can't find the container with id c65d524c3f3e2b7cc564f2133786f2e74b3762a6a4b0c2169e5cc91eab260b55
Feb 24 03:16:27 crc kubenswrapper[4923]: I0224 03:16:27.572033 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 24 03:16:27 crc kubenswrapper[4923]: I0224 03:16:27.726904 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51492e23-21d6-46a9-a8e6-2e7cf4063c0a" path="/var/lib/kubelet/pods/51492e23-21d6-46a9-a8e6-2e7cf4063c0a/volumes"
Feb 24 03:16:27 crc kubenswrapper[4923]: I0224 03:16:27.727817 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e95191de-44e3-4252-9fcf-cf159a81d3c8" path="/var/lib/kubelet/pods/e95191de-44e3-4252-9fcf-cf159a81d3c8/volumes"
Feb 24 03:16:28 crc kubenswrapper[4923]: I0224 03:16:28.441615 4923 generic.go:334] "Generic (PLEG): container finished" podID="9124ee53-9cb0-4817-b967-d22e84935a4d" containerID="710b2ca33276dd57b06f5554cab60da196cbafa53c2fcab67057a10b0d5b6ec4" exitCode=0
Feb 24 03:16:28 crc kubenswrapper[4923]: I0224 03:16:28.441891 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mvsg9" event={"ID":"9124ee53-9cb0-4817-b967-d22e84935a4d","Type":"ContainerDied","Data":"710b2ca33276dd57b06f5554cab60da196cbafa53c2fcab67057a10b0d5b6ec4"}
Feb 24 03:16:28 crc kubenswrapper[4923]: I0224 03:16:28.456607 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af0866b-f6b6-45cb-9322-25fc22f6b6b4","Type":"ContainerStarted","Data":"e2f5d1cb6584c73c4b005bb24c782627a843084b28d5c844bd5319f858257bfc"}
Feb 24 03:16:28 crc kubenswrapper[4923]: I0224 03:16:28.456661 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af0866b-f6b6-45cb-9322-25fc22f6b6b4","Type":"ContainerStarted","Data":"c65d524c3f3e2b7cc564f2133786f2e74b3762a6a4b0c2169e5cc91eab260b55"}
Feb 24 03:16:29 crc kubenswrapper[4923]: I0224 03:16:29.477932 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af0866b-f6b6-45cb-9322-25fc22f6b6b4","Type":"ContainerStarted","Data":"6a66eba3058ccaed5772f17b6979105a78de3bf59bc2030a8fd0d74694089955"}
Feb 24 03:16:29 crc kubenswrapper[4923]: I0224 03:16:29.933434 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mvsg9"
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.051414 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-combined-ca-bundle\") pod \"9124ee53-9cb0-4817-b967-d22e84935a4d\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") "
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.051600 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-config-data\") pod \"9124ee53-9cb0-4817-b967-d22e84935a4d\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") "
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.051654 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g24vc\" (UniqueName: \"kubernetes.io/projected/9124ee53-9cb0-4817-b967-d22e84935a4d-kube-api-access-g24vc\") pod \"9124ee53-9cb0-4817-b967-d22e84935a4d\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") "
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.051693 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-scripts\") pod \"9124ee53-9cb0-4817-b967-d22e84935a4d\" (UID: \"9124ee53-9cb0-4817-b967-d22e84935a4d\") "
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.061590 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9124ee53-9cb0-4817-b967-d22e84935a4d-kube-api-access-g24vc" (OuterVolumeSpecName: "kube-api-access-g24vc") pod "9124ee53-9cb0-4817-b967-d22e84935a4d" (UID: "9124ee53-9cb0-4817-b967-d22e84935a4d"). InnerVolumeSpecName "kube-api-access-g24vc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.068483 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-scripts" (OuterVolumeSpecName: "scripts") pod "9124ee53-9cb0-4817-b967-d22e84935a4d" (UID: "9124ee53-9cb0-4817-b967-d22e84935a4d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.078535 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-config-data" (OuterVolumeSpecName: "config-data") pod "9124ee53-9cb0-4817-b967-d22e84935a4d" (UID: "9124ee53-9cb0-4817-b967-d22e84935a4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.086875 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9124ee53-9cb0-4817-b967-d22e84935a4d" (UID: "9124ee53-9cb0-4817-b967-d22e84935a4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.153856 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-config-data\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.153900 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g24vc\" (UniqueName: \"kubernetes.io/projected/9124ee53-9cb0-4817-b967-d22e84935a4d-kube-api-access-g24vc\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.153917 4923 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-scripts\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.153929 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124ee53-9cb0-4817-b967-d22e84935a4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.488499 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mvsg9" event={"ID":"9124ee53-9cb0-4817-b967-d22e84935a4d","Type":"ContainerDied","Data":"5bf8b901c7b76c831a4e1ab15f57db4de197456c83fab773ea2e5b15b39fe3b0"}
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.488546 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bf8b901c7b76c831a4e1ab15f57db4de197456c83fab773ea2e5b15b39fe3b0"
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.488590 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mvsg9"
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.491463 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af0866b-f6b6-45cb-9322-25fc22f6b6b4","Type":"ContainerStarted","Data":"ae9ac08a23353848582e36dbd4268f2729fdaefccf7a8be0f0228d49996d4e22"}
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.578220 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.578781 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerName="nova-api-log" containerID="cri-o://a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1" gracePeriod=30
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.579218 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerName="nova-api-api" containerID="cri-o://d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c" gracePeriod=30
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.589490 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.589701 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="21bbf39b-7001-4721-a0d4-95d2f1d523e9" containerName="nova-scheduler-scheduler" containerID="cri-o://afe1bc40b79046935e2a3157e103afafcb0a59d861e2fc1a06f04294a34ac01a" gracePeriod=30
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.648435 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.648718 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-log" containerID="cri-o://95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09" gracePeriod=30
Feb 24 03:16:30 crc kubenswrapper[4923]: I0224 03:16:30.648842 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-metadata" containerID="cri-o://441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7" gracePeriod=30
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.135411 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.272237 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-public-tls-certs\") pod \"4d90909e-92bd-4796-98c0-21f2f1b51e04\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") "
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.272513 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d90909e-92bd-4796-98c0-21f2f1b51e04-logs\") pod \"4d90909e-92bd-4796-98c0-21f2f1b51e04\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") "
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.272532 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-combined-ca-bundle\") pod \"4d90909e-92bd-4796-98c0-21f2f1b51e04\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") "
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.272626 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-internal-tls-certs\") pod \"4d90909e-92bd-4796-98c0-21f2f1b51e04\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") "
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.272670 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4qnb\" (UniqueName: \"kubernetes.io/projected/4d90909e-92bd-4796-98c0-21f2f1b51e04-kube-api-access-b4qnb\") pod \"4d90909e-92bd-4796-98c0-21f2f1b51e04\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") "
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.273014 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d90909e-92bd-4796-98c0-21f2f1b51e04-logs" (OuterVolumeSpecName: "logs") pod "4d90909e-92bd-4796-98c0-21f2f1b51e04" (UID: "4d90909e-92bd-4796-98c0-21f2f1b51e04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.273069 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-config-data\") pod \"4d90909e-92bd-4796-98c0-21f2f1b51e04\" (UID: \"4d90909e-92bd-4796-98c0-21f2f1b51e04\") "
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.273452 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d90909e-92bd-4796-98c0-21f2f1b51e04-logs\") on node \"crc\" DevicePath \"\""
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.280511 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d90909e-92bd-4796-98c0-21f2f1b51e04-kube-api-access-b4qnb" (OuterVolumeSpecName: "kube-api-access-b4qnb") pod "4d90909e-92bd-4796-98c0-21f2f1b51e04" (UID: "4d90909e-92bd-4796-98c0-21f2f1b51e04"). InnerVolumeSpecName "kube-api-access-b4qnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.304196 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d90909e-92bd-4796-98c0-21f2f1b51e04" (UID: "4d90909e-92bd-4796-98c0-21f2f1b51e04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.317327 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-config-data" (OuterVolumeSpecName: "config-data") pod "4d90909e-92bd-4796-98c0-21f2f1b51e04" (UID: "4d90909e-92bd-4796-98c0-21f2f1b51e04"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.328732 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4d90909e-92bd-4796-98c0-21f2f1b51e04" (UID: "4d90909e-92bd-4796-98c0-21f2f1b51e04"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.344008 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4d90909e-92bd-4796-98c0-21f2f1b51e04" (UID: "4d90909e-92bd-4796-98c0-21f2f1b51e04"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.375915 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.375958 4923 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.375975 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.375988 4923 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d90909e-92bd-4796-98c0-21f2f1b51e04-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.376000 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4qnb\" (UniqueName: \"kubernetes.io/projected/4d90909e-92bd-4796-98c0-21f2f1b51e04-kube-api-access-b4qnb\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.507524 4923 generic.go:334] "Generic (PLEG): container finished" podID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerID="d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c" exitCode=0 Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.507556 4923 generic.go:334] "Generic (PLEG): container finished" podID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerID="a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1" exitCode=143 Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.507593 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d90909e-92bd-4796-98c0-21f2f1b51e04","Type":"ContainerDied","Data":"d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c"} Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.507622 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d90909e-92bd-4796-98c0-21f2f1b51e04","Type":"ContainerDied","Data":"a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1"} Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.507632 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4d90909e-92bd-4796-98c0-21f2f1b51e04","Type":"ContainerDied","Data":"715f31da8718e7c0c85b093adbc2a91e0fe53cb60cb23b4eaee77f50a35b2dc9"} Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.507648 4923 scope.go:117] "RemoveContainer" containerID="d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.507789 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.525624 4923 generic.go:334] "Generic (PLEG): container finished" podID="21bbf39b-7001-4721-a0d4-95d2f1d523e9" containerID="afe1bc40b79046935e2a3157e103afafcb0a59d861e2fc1a06f04294a34ac01a" exitCode=0 Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.525709 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21bbf39b-7001-4721-a0d4-95d2f1d523e9","Type":"ContainerDied","Data":"afe1bc40b79046935e2a3157e103afafcb0a59d861e2fc1a06f04294a34ac01a"} Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.535030 4923 generic.go:334] "Generic (PLEG): container finished" podID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerID="95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09" exitCode=143 Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.535081 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a19b08f-d8dc-4bf6-b907-f349739f12b8","Type":"ContainerDied","Data":"95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09"} Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.552345 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.587442 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.595393 4923 scope.go:117] "RemoveContainer" containerID="a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.599909 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:31 crc kubenswrapper[4923]: E0224 03:16:31.601040 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerName="nova-api-log" Feb 24 03:16:31 
crc kubenswrapper[4923]: I0224 03:16:31.601058 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerName="nova-api-log" Feb 24 03:16:31 crc kubenswrapper[4923]: E0224 03:16:31.601071 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9124ee53-9cb0-4817-b967-d22e84935a4d" containerName="nova-manage" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.601077 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="9124ee53-9cb0-4817-b967-d22e84935a4d" containerName="nova-manage" Feb 24 03:16:31 crc kubenswrapper[4923]: E0224 03:16:31.601141 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerName="nova-api-api" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.601200 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerName="nova-api-api" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.601536 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerName="nova-api-log" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.601559 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d90909e-92bd-4796-98c0-21f2f1b51e04" containerName="nova-api-api" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.601576 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="9124ee53-9cb0-4817-b967-d22e84935a4d" containerName="nova-manage" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.602598 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.605076 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.605222 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.605412 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.619207 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.636404 4923 scope.go:117] "RemoveContainer" containerID="d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.640569 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 24 03:16:31 crc kubenswrapper[4923]: E0224 03:16:31.640848 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c\": container with ID starting with d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c not found: ID does not exist" containerID="d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.640876 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c"} err="failed to get container status \"d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c\": rpc error: code = NotFound desc = could not find container \"d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c\": container with 
ID starting with d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c not found: ID does not exist" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.640896 4923 scope.go:117] "RemoveContainer" containerID="a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1" Feb 24 03:16:31 crc kubenswrapper[4923]: E0224 03:16:31.643505 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1\": container with ID starting with a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1 not found: ID does not exist" containerID="a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.643540 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1"} err="failed to get container status \"a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1\": rpc error: code = NotFound desc = could not find container \"a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1\": container with ID starting with a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1 not found: ID does not exist" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.643557 4923 scope.go:117] "RemoveContainer" containerID="d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.644406 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c"} err="failed to get container status \"d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c\": rpc error: code = NotFound desc = could not find container \"d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c\": 
container with ID starting with d47623b9f5f03f3f98a83f12b22cf1123e01dc80b1582ce28ff31bfeb489092c not found: ID does not exist" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.644425 4923 scope.go:117] "RemoveContainer" containerID="a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.648667 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1"} err="failed to get container status \"a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1\": rpc error: code = NotFound desc = could not find container \"a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1\": container with ID starting with a1381cd82e9164a9c20fb37eb4487b3d26b7967fd405011ea028b8825be7a2c1 not found: ID does not exist" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.740755 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d90909e-92bd-4796-98c0-21f2f1b51e04" path="/var/lib/kubelet/pods/4d90909e-92bd-4796-98c0-21f2f1b51e04/volumes" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.784859 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsmsm\" (UniqueName: \"kubernetes.io/projected/21bbf39b-7001-4721-a0d4-95d2f1d523e9-kube-api-access-nsmsm\") pod \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.784994 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-combined-ca-bundle\") pod \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.785047 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-config-data\") pod \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\" (UID: \"21bbf39b-7001-4721-a0d4-95d2f1d523e9\") " Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.785501 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/633331a8-df46-4c85-b234-1e2820565794-logs\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.785690 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-config-data\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.785742 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-internal-tls-certs\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.785774 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-public-tls-certs\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.786093 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.786171 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87cl4\" (UniqueName: \"kubernetes.io/projected/633331a8-df46-4c85-b234-1e2820565794-kube-api-access-87cl4\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.791468 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21bbf39b-7001-4721-a0d4-95d2f1d523e9-kube-api-access-nsmsm" (OuterVolumeSpecName: "kube-api-access-nsmsm") pod "21bbf39b-7001-4721-a0d4-95d2f1d523e9" (UID: "21bbf39b-7001-4721-a0d4-95d2f1d523e9"). InnerVolumeSpecName "kube-api-access-nsmsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.812402 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-config-data" (OuterVolumeSpecName: "config-data") pod "21bbf39b-7001-4721-a0d4-95d2f1d523e9" (UID: "21bbf39b-7001-4721-a0d4-95d2f1d523e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.826272 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21bbf39b-7001-4721-a0d4-95d2f1d523e9" (UID: "21bbf39b-7001-4721-a0d4-95d2f1d523e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.888122 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87cl4\" (UniqueName: \"kubernetes.io/projected/633331a8-df46-4c85-b234-1e2820565794-kube-api-access-87cl4\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.888218 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/633331a8-df46-4c85-b234-1e2820565794-logs\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.888253 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-config-data\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.888708 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/633331a8-df46-4c85-b234-1e2820565794-logs\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.888728 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-internal-tls-certs\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.888840 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-public-tls-certs\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.889071 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.889246 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsmsm\" (UniqueName: \"kubernetes.io/projected/21bbf39b-7001-4721-a0d4-95d2f1d523e9-kube-api-access-nsmsm\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.889272 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.889286 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21bbf39b-7001-4721-a0d4-95d2f1d523e9-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.891877 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-internal-tls-certs\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.891896 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-config-data\") pod \"nova-api-0\" (UID: 
\"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.892322 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.892871 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/633331a8-df46-4c85-b234-1e2820565794-public-tls-certs\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.905213 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87cl4\" (UniqueName: \"kubernetes.io/projected/633331a8-df46-4c85-b234-1e2820565794-kube-api-access-87cl4\") pod \"nova-api-0\" (UID: \"633331a8-df46-4c85-b234-1e2820565794\") " pod="openstack/nova-api-0" Feb 24 03:16:31 crc kubenswrapper[4923]: I0224 03:16:31.936448 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 24 03:16:32 crc kubenswrapper[4923]: W0224 03:16:32.394909 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod633331a8_df46_4c85_b234_1e2820565794.slice/crio-4eb28fd48ecdb630269620f7957322b2277b30726ff877d93752840773f3c94d WatchSource:0}: Error finding container 4eb28fd48ecdb630269620f7957322b2277b30726ff877d93752840773f3c94d: Status 404 returned error can't find the container with id 4eb28fd48ecdb630269620f7957322b2277b30726ff877d93752840773f3c94d Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.405642 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.552683 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21bbf39b-7001-4721-a0d4-95d2f1d523e9","Type":"ContainerDied","Data":"39bfe6d87e7fe2135caaefc1fde9086729590d8a6a88398ecbdd71c63fe3ecad"} Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.552733 4923 scope.go:117] "RemoveContainer" containerID="afe1bc40b79046935e2a3157e103afafcb0a59d861e2fc1a06f04294a34ac01a" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.552861 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.557720 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0af0866b-f6b6-45cb-9322-25fc22f6b6b4","Type":"ContainerStarted","Data":"da97d7c1373f06dd03c65a927d0a9e4e3e6aba4d95c2d0432f4d5f31036fd9d8"} Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.558681 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.560700 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"633331a8-df46-4c85-b234-1e2820565794","Type":"ContainerStarted","Data":"4eb28fd48ecdb630269620f7957322b2277b30726ff877d93752840773f3c94d"} Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.602323 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.728504367 podStartE2EDuration="6.602286943s" podCreationTimestamp="2026-02-24 03:16:26 +0000 UTC" firstStartedPulling="2026-02-24 03:16:27.571710509 +0000 UTC m=+1311.588781322" lastFinishedPulling="2026-02-24 03:16:31.445493085 +0000 UTC m=+1315.462563898" observedRunningTime="2026-02-24 03:16:32.584129046 +0000 UTC m=+1316.601199879" watchObservedRunningTime="2026-02-24 03:16:32.602286943 +0000 UTC m=+1316.619357756" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.636759 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.645405 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.653438 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:16:32 crc kubenswrapper[4923]: E0224 03:16:32.653990 4923 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="21bbf39b-7001-4721-a0d4-95d2f1d523e9" containerName="nova-scheduler-scheduler" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.654084 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="21bbf39b-7001-4721-a0d4-95d2f1d523e9" containerName="nova-scheduler-scheduler" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.654348 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="21bbf39b-7001-4721-a0d4-95d2f1d523e9" containerName="nova-scheduler-scheduler" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.654955 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.657498 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.662138 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.822015 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8d4b38-c639-42b0-a48a-6193bee91648-config-data\") pod \"nova-scheduler-0\" (UID: \"5a8d4b38-c639-42b0-a48a-6193bee91648\") " pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.822095 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v28g5\" (UniqueName: \"kubernetes.io/projected/5a8d4b38-c639-42b0-a48a-6193bee91648-kube-api-access-v28g5\") pod \"nova-scheduler-0\" (UID: \"5a8d4b38-c639-42b0-a48a-6193bee91648\") " pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.822159 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5a8d4b38-c639-42b0-a48a-6193bee91648-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5a8d4b38-c639-42b0-a48a-6193bee91648\") " pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.923981 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8d4b38-c639-42b0-a48a-6193bee91648-config-data\") pod \"nova-scheduler-0\" (UID: \"5a8d4b38-c639-42b0-a48a-6193bee91648\") " pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.924053 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v28g5\" (UniqueName: \"kubernetes.io/projected/5a8d4b38-c639-42b0-a48a-6193bee91648-kube-api-access-v28g5\") pod \"nova-scheduler-0\" (UID: \"5a8d4b38-c639-42b0-a48a-6193bee91648\") " pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.924108 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8d4b38-c639-42b0-a48a-6193bee91648-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5a8d4b38-c639-42b0-a48a-6193bee91648\") " pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.928024 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8d4b38-c639-42b0-a48a-6193bee91648-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5a8d4b38-c639-42b0-a48a-6193bee91648\") " pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.939800 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8d4b38-c639-42b0-a48a-6193bee91648-config-data\") pod \"nova-scheduler-0\" (UID: \"5a8d4b38-c639-42b0-a48a-6193bee91648\") " 
pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.947760 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v28g5\" (UniqueName: \"kubernetes.io/projected/5a8d4b38-c639-42b0-a48a-6193bee91648-kube-api-access-v28g5\") pod \"nova-scheduler-0\" (UID: \"5a8d4b38-c639-42b0-a48a-6193bee91648\") " pod="openstack/nova-scheduler-0" Feb 24 03:16:32 crc kubenswrapper[4923]: I0224 03:16:32.978389 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 24 03:16:33 crc kubenswrapper[4923]: I0224 03:16:33.458922 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 24 03:16:33 crc kubenswrapper[4923]: W0224 03:16:33.461191 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a8d4b38_c639_42b0_a48a_6193bee91648.slice/crio-264dda42a4ceceb66c214c3ed77e3a2bda7a7f6dcb1abea985b0b0f4de047a5c WatchSource:0}: Error finding container 264dda42a4ceceb66c214c3ed77e3a2bda7a7f6dcb1abea985b0b0f4de047a5c: Status 404 returned error can't find the container with id 264dda42a4ceceb66c214c3ed77e3a2bda7a7f6dcb1abea985b0b0f4de047a5c Feb 24 03:16:33 crc kubenswrapper[4923]: I0224 03:16:33.574605 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5a8d4b38-c639-42b0-a48a-6193bee91648","Type":"ContainerStarted","Data":"264dda42a4ceceb66c214c3ed77e3a2bda7a7f6dcb1abea985b0b0f4de047a5c"} Feb 24 03:16:33 crc kubenswrapper[4923]: I0224 03:16:33.578178 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"633331a8-df46-4c85-b234-1e2820565794","Type":"ContainerStarted","Data":"a93e43973123e32f6beb51b6d8177e0d0942de48f2235c9fa80530701608303c"} Feb 24 03:16:33 crc kubenswrapper[4923]: I0224 03:16:33.578237 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"633331a8-df46-4c85-b234-1e2820565794","Type":"ContainerStarted","Data":"10bc61182f6e6f8311022b3aadd22f3a83ae35dce099c5cfd10ac7e6720e657f"} Feb 24 03:16:33 crc kubenswrapper[4923]: I0224 03:16:33.725287 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21bbf39b-7001-4721-a0d4-95d2f1d523e9" path="/var/lib/kubelet/pods/21bbf39b-7001-4721-a0d4-95d2f1d523e9/volumes" Feb 24 03:16:33 crc kubenswrapper[4923]: I0224 03:16:33.779834 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:56594->10.217.0.202:8775: read: connection reset by peer" Feb 24 03:16:33 crc kubenswrapper[4923]: I0224 03:16:33.780020 4923 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:56610->10.217.0.202:8775: read: connection reset by peer" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.259394 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.282897 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.2828764440000002 podStartE2EDuration="3.282876444s" podCreationTimestamp="2026-02-24 03:16:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:16:33.606913262 +0000 UTC m=+1317.623984135" watchObservedRunningTime="2026-02-24 03:16:34.282876444 +0000 UTC m=+1318.299947257" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.356551 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-nova-metadata-tls-certs\") pod \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.356771 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-combined-ca-bundle\") pod \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.356830 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-config-data\") pod \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.356901 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a19b08f-d8dc-4bf6-b907-f349739f12b8-logs\") pod \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\" (UID: 
\"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.356956 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hprzq\" (UniqueName: \"kubernetes.io/projected/4a19b08f-d8dc-4bf6-b907-f349739f12b8-kube-api-access-hprzq\") pod \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\" (UID: \"4a19b08f-d8dc-4bf6-b907-f349739f12b8\") " Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.357412 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a19b08f-d8dc-4bf6-b907-f349739f12b8-logs" (OuterVolumeSpecName: "logs") pod "4a19b08f-d8dc-4bf6-b907-f349739f12b8" (UID: "4a19b08f-d8dc-4bf6-b907-f349739f12b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.357527 4923 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a19b08f-d8dc-4bf6-b907-f349739f12b8-logs\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.366190 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a19b08f-d8dc-4bf6-b907-f349739f12b8-kube-api-access-hprzq" (OuterVolumeSpecName: "kube-api-access-hprzq") pod "4a19b08f-d8dc-4bf6-b907-f349739f12b8" (UID: "4a19b08f-d8dc-4bf6-b907-f349739f12b8"). InnerVolumeSpecName "kube-api-access-hprzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.398835 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a19b08f-d8dc-4bf6-b907-f349739f12b8" (UID: "4a19b08f-d8dc-4bf6-b907-f349739f12b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.401939 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-config-data" (OuterVolumeSpecName: "config-data") pod "4a19b08f-d8dc-4bf6-b907-f349739f12b8" (UID: "4a19b08f-d8dc-4bf6-b907-f349739f12b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.434636 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4a19b08f-d8dc-4bf6-b907-f349739f12b8" (UID: "4a19b08f-d8dc-4bf6-b907-f349739f12b8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.458958 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.459194 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hprzq\" (UniqueName: \"kubernetes.io/projected/4a19b08f-d8dc-4bf6-b907-f349739f12b8-kube-api-access-hprzq\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.459352 4923 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.459460 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a19b08f-d8dc-4bf6-b907-f349739f12b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.591191 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5a8d4b38-c639-42b0-a48a-6193bee91648","Type":"ContainerStarted","Data":"9644bb2fe5bc0322ee769c64795879dc0c6a8e6ab7ad04efd7814f10ddd85153"} Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.594240 4923 generic.go:334] "Generic (PLEG): container finished" podID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerID="441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7" exitCode=0 Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.594332 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a19b08f-d8dc-4bf6-b907-f349739f12b8","Type":"ContainerDied","Data":"441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7"} Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.594379 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a19b08f-d8dc-4bf6-b907-f349739f12b8","Type":"ContainerDied","Data":"187c1fa63f1ea365f2deb87a195b6552ccb7915000c83246d7230f6c3f9b4d2f"} Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.594407 4923 scope.go:117] "RemoveContainer" containerID="441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.594702 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.616352 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.616332703 podStartE2EDuration="2.616332703s" podCreationTimestamp="2026-02-24 03:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:16:34.607440644 +0000 UTC m=+1318.624511477" watchObservedRunningTime="2026-02-24 03:16:34.616332703 +0000 UTC m=+1318.633403516" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.622527 4923 scope.go:117] "RemoveContainer" containerID="95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.644037 4923 scope.go:117] "RemoveContainer" containerID="441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7" Feb 24 03:16:34 crc kubenswrapper[4923]: E0224 03:16:34.644541 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7\": container with ID starting with 441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7 not found: ID does not exist" containerID="441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.644590 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7"} err="failed to get container status \"441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7\": rpc error: code = NotFound desc = could not find container \"441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7\": container with ID starting with 
441a37768ae1abf50c42d470e504c43c8faad7f31a0d75393d661d917899b6a7 not found: ID does not exist" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.644608 4923 scope.go:117] "RemoveContainer" containerID="95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09" Feb 24 03:16:34 crc kubenswrapper[4923]: E0224 03:16:34.645108 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09\": container with ID starting with 95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09 not found: ID does not exist" containerID="95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.645150 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09"} err="failed to get container status \"95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09\": rpc error: code = NotFound desc = could not find container \"95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09\": container with ID starting with 95c737e32791047aae4cc518655e3ee3ddf74fde33b2da6bc13c23fbc81f4a09 not found: ID does not exist" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.646164 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.655159 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.685395 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:16:34 crc kubenswrapper[4923]: E0224 03:16:34.686479 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-metadata" Feb 
24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.686683 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-metadata" Feb 24 03:16:34 crc kubenswrapper[4923]: E0224 03:16:34.686776 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-log" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.686845 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-log" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.687532 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-log" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.687676 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" containerName="nova-metadata-metadata" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.690411 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.705210 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.705634 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.718220 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.907453 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-logs\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.907854 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.907890 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-config-data\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.908613 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqhc\" (UniqueName: \"kubernetes.io/projected/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-kube-api-access-8dqhc\") pod \"nova-metadata-0\" 
(UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:34 crc kubenswrapper[4923]: I0224 03:16:34.909620 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.010992 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.011040 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-config-data\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.011093 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dqhc\" (UniqueName: \"kubernetes.io/projected/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-kube-api-access-8dqhc\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.011119 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 
03:16:35.011178 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-logs\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.011516 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-logs\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.017204 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.017798 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.026772 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-config-data\") pod \"nova-metadata-0\" (UID: \"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.027142 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqhc\" (UniqueName: \"kubernetes.io/projected/9f2c858b-ff6d-44cb-9925-d4c0ef27f133-kube-api-access-8dqhc\") pod \"nova-metadata-0\" (UID: 
\"9f2c858b-ff6d-44cb-9925-d4c0ef27f133\") " pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.030218 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 24 03:16:35 crc kubenswrapper[4923]: W0224 03:16:35.329414 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f2c858b_ff6d_44cb_9925_d4c0ef27f133.slice/crio-a1fa69ed753b969539368c99d84325b5d0960d1e3070e1f02e28aed72475e939 WatchSource:0}: Error finding container a1fa69ed753b969539368c99d84325b5d0960d1e3070e1f02e28aed72475e939: Status 404 returned error can't find the container with id a1fa69ed753b969539368c99d84325b5d0960d1e3070e1f02e28aed72475e939 Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.332136 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.606937 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f2c858b-ff6d-44cb-9925-d4c0ef27f133","Type":"ContainerStarted","Data":"bebd4adc656fd32c66ee671e9193204071d84f8edd66b7998d50da8d4bad09f5"} Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.606983 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f2c858b-ff6d-44cb-9925-d4c0ef27f133","Type":"ContainerStarted","Data":"a1fa69ed753b969539368c99d84325b5d0960d1e3070e1f02e28aed72475e939"} Feb 24 03:16:35 crc kubenswrapper[4923]: I0224 03:16:35.724088 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a19b08f-d8dc-4bf6-b907-f349739f12b8" path="/var/lib/kubelet/pods/4a19b08f-d8dc-4bf6-b907-f349739f12b8/volumes" Feb 24 03:16:36 crc kubenswrapper[4923]: I0224 03:16:36.623818 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9f2c858b-ff6d-44cb-9925-d4c0ef27f133","Type":"ContainerStarted","Data":"866c31d1445be3366ede20247c205bb29b441014bd631f935eb7f524a53c6708"} Feb 24 03:16:37 crc kubenswrapper[4923]: I0224 03:16:37.978510 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 24 03:16:40 crc kubenswrapper[4923]: I0224 03:16:40.031644 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 24 03:16:40 crc kubenswrapper[4923]: I0224 03:16:40.032266 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 24 03:16:41 crc kubenswrapper[4923]: I0224 03:16:41.938009 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 24 03:16:41 crc kubenswrapper[4923]: I0224 03:16:41.938413 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 24 03:16:42 crc kubenswrapper[4923]: I0224 03:16:42.972502 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="633331a8-df46-4c85-b234-1e2820565794" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 24 03:16:42 crc kubenswrapper[4923]: I0224 03:16:42.972621 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="633331a8-df46-4c85-b234-1e2820565794" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.211:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 24 03:16:42 crc kubenswrapper[4923]: I0224 03:16:42.978558 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 24 03:16:43 crc kubenswrapper[4923]: I0224 03:16:43.025076 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Feb 24 03:16:43 crc kubenswrapper[4923]: I0224 03:16:43.059524 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=9.059498173 podStartE2EDuration="9.059498173s" podCreationTimestamp="2026-02-24 03:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:16:36.647567617 +0000 UTC m=+1320.664638470" watchObservedRunningTime="2026-02-24 03:16:43.059498173 +0000 UTC m=+1327.076569016" Feb 24 03:16:43 crc kubenswrapper[4923]: I0224 03:16:43.747953 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 24 03:16:45 crc kubenswrapper[4923]: I0224 03:16:45.031531 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 24 03:16:45 crc kubenswrapper[4923]: I0224 03:16:45.031602 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 24 03:16:46 crc kubenswrapper[4923]: I0224 03:16:46.049479 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9f2c858b-ff6d-44cb-9925-d4c0ef27f133" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 24 03:16:46 crc kubenswrapper[4923]: I0224 03:16:46.049478 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9f2c858b-ff6d-44cb-9925-d4c0ef27f133" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 24 03:16:49 crc kubenswrapper[4923]: I0224 03:16:49.915973 4923 patch_prober.go:28] interesting 
pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:16:49 crc kubenswrapper[4923]: I0224 03:16:49.916487 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:16:51 crc kubenswrapper[4923]: I0224 03:16:51.943029 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 24 03:16:51 crc kubenswrapper[4923]: I0224 03:16:51.943583 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 24 03:16:51 crc kubenswrapper[4923]: I0224 03:16:51.943603 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 24 03:16:51 crc kubenswrapper[4923]: I0224 03:16:51.951585 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 24 03:16:52 crc kubenswrapper[4923]: I0224 03:16:52.822716 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 24 03:16:52 crc kubenswrapper[4923]: I0224 03:16:52.868910 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 24 03:16:55 crc kubenswrapper[4923]: I0224 03:16:55.038359 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 24 03:16:55 crc kubenswrapper[4923]: I0224 03:16:55.045289 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 24 03:16:55 crc kubenswrapper[4923]: 
I0224 03:16:55.046256 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 24 03:16:55 crc kubenswrapper[4923]: I0224 03:16:55.866971 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 24 03:16:57 crc kubenswrapper[4923]: I0224 03:16:57.102379 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 24 03:17:06 crc kubenswrapper[4923]: I0224 03:17:06.587151 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 24 03:17:07 crc kubenswrapper[4923]: I0224 03:17:07.376779 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 24 03:17:10 crc kubenswrapper[4923]: I0224 03:17:10.633463 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d9950d0b-d980-4e4f-82b4-9f616c6c50a3" containerName="rabbitmq" containerID="cri-o://713435589abcdf1b3efff239c9809bb5f70abfdf930ae064d9f58c2904554131" gracePeriod=604796 Feb 24 03:17:11 crc kubenswrapper[4923]: I0224 03:17:11.498753 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" containerName="rabbitmq" containerID="cri-o://f214badca1940336d886b38fbd14ffa06f3f3192e9d3d1cef2ec79f7bef5c6b0" gracePeriod=604796 Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.071919 4923 generic.go:334] "Generic (PLEG): container finished" podID="d9950d0b-d980-4e4f-82b4-9f616c6c50a3" containerID="713435589abcdf1b3efff239c9809bb5f70abfdf930ae064d9f58c2904554131" exitCode=0 Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.071991 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"d9950d0b-d980-4e4f-82b4-9f616c6c50a3","Type":"ContainerDied","Data":"713435589abcdf1b3efff239c9809bb5f70abfdf930ae064d9f58c2904554131"} Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.309498 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.320735 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-tls\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.320870 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-config-data\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.321668 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-server-conf\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.321733 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-pod-info\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.321775 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-plugins-conf\") pod 
\"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.321883 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mfkt\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-kube-api-access-8mfkt\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.321981 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-erlang-cookie-secret\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.322016 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-erlang-cookie\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.322053 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-plugins\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.322089 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.322176 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-confd\") pod \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\" (UID: \"d9950d0b-d980-4e4f-82b4-9f616c6c50a3\") " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.323238 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.323836 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.324060 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.329327 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-kube-api-access-8mfkt" (OuterVolumeSpecName: "kube-api-access-8mfkt") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "kube-api-access-8mfkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.329359 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.329397 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.329463 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.329489 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-pod-info" (OuterVolumeSpecName: "pod-info") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.372100 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-config-data" (OuterVolumeSpecName: "config-data") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.424426 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mfkt\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-kube-api-access-8mfkt\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.424464 4923 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.424478 4923 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.424491 4923 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.424518 4923 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.424532 4923 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.424544 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.424553 4923 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-pod-info\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.424563 4923 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.453193 4923 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.454380 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-server-conf" (OuterVolumeSpecName: "server-conf") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.474523 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d9950d0b-d980-4e4f-82b4-9f616c6c50a3" (UID: "d9950d0b-d980-4e4f-82b4-9f616c6c50a3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.526118 4923 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.527078 4923 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:17 crc kubenswrapper[4923]: I0224 03:17:17.527146 4923 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d9950d0b-d980-4e4f-82b4-9f616c6c50a3-server-conf\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.083661 4923 generic.go:334] "Generic (PLEG): container finished" podID="6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" containerID="f214badca1940336d886b38fbd14ffa06f3f3192e9d3d1cef2ec79f7bef5c6b0" exitCode=0 Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.085743 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0","Type":"ContainerDied","Data":"f214badca1940336d886b38fbd14ffa06f3f3192e9d3d1cef2ec79f7bef5c6b0"} Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.085925 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0","Type":"ContainerDied","Data":"bacef2b23fac862774c61524ba1bb06b4e1a584c35d7b2ef35bd60d0fed2d8b3"} Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.086010 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bacef2b23fac862774c61524ba1bb06b4e1a584c35d7b2ef35bd60d0fed2d8b3" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.088224 4923 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.089736 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d9950d0b-d980-4e4f-82b4-9f616c6c50a3","Type":"ContainerDied","Data":"01fafa67e410af4f22a25e272dd86fd6d5177dcd60850a395c39f049657b860c"} Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.089783 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.089790 4923 scope.go:117] "RemoveContainer" containerID="713435589abcdf1b3efff239c9809bb5f70abfdf930ae064d9f58c2904554131" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.118198 4923 scope.go:117] "RemoveContainer" containerID="1ef05ab77af0e174ff2a3a3a25eb2a8838b22904e83d0e1d6e1693dfaaf19763" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.140741 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.194352 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.219447 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 24 03:17:18 crc kubenswrapper[4923]: E0224 03:17:18.219911 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9950d0b-d980-4e4f-82b4-9f616c6c50a3" containerName="setup-container" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.219944 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9950d0b-d980-4e4f-82b4-9f616c6c50a3" containerName="setup-container" Feb 24 03:17:18 crc kubenswrapper[4923]: E0224 03:17:18.219978 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" 
containerName="setup-container" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.219986 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" containerName="setup-container" Feb 24 03:17:18 crc kubenswrapper[4923]: E0224 03:17:18.220004 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9950d0b-d980-4e4f-82b4-9f616c6c50a3" containerName="rabbitmq" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.220014 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9950d0b-d980-4e4f-82b4-9f616c6c50a3" containerName="rabbitmq" Feb 24 03:17:18 crc kubenswrapper[4923]: E0224 03:17:18.220032 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" containerName="rabbitmq" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.220039 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" containerName="rabbitmq" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.220241 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" containerName="rabbitmq" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.220273 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9950d0b-d980-4e4f-82b4-9f616c6c50a3" containerName="rabbitmq" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.222041 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.228975 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.229442 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.229481 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.229673 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-jmcj4" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.230273 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.230513 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.230719 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.242587 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-server-conf\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.242755 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-tls\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.242876 
4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-erlang-cookie\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.242955 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-pod-info\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.243046 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-config-data\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.243162 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-plugins-conf\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.243246 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-confd\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.243342 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: 
\"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.263405 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-plugins\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.263749 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-erlang-cookie-secret\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.263832 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fg5q\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-kube-api-access-4fg5q\") pod \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\" (UID: \"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0\") " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.269153 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.247720 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.269785 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.270383 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.272270 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-pod-info" (OuterVolumeSpecName: "pod-info") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.279670 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.282471 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.296011 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.319457 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-kube-api-access-4fg5q" (OuterVolumeSpecName: "kube-api-access-4fg5q") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "kube-api-access-4fg5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.360575 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-config-data" (OuterVolumeSpecName: "config-data") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375286 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375647 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-config-data\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375672 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375696 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375732 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 
03:17:18.375752 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375785 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375822 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375865 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375894 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gj5f\" (UniqueName: \"kubernetes.io/projected/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-kube-api-access-5gj5f\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375925 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375974 4923 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375986 4923 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.375996 4923 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-pod-info\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.376004 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.376014 4923 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.376032 4923 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.376041 4923 reconciler_common.go:293] "Volume detached 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.376049 4923 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.376058 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fg5q\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-kube-api-access-4fg5q\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.380058 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-server-conf" (OuterVolumeSpecName: "server-conf") pod "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.449024 4923 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478108 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gj5f\" (UniqueName: \"kubernetes.io/projected/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-kube-api-access-5gj5f\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478164 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478197 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478218 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-config-data\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478240 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") 
pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478261 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478291 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478382 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478418 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478487 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: 
I0224 03:17:18.478575 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478640 4923 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-server-conf\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478656 4923 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.478683 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.480219 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.480634 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-config-data\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.480831 
4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.481255 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.481943 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.484072 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.484323 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.486266 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod 
"6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" (UID: "6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.501118 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.502229 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.512607 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gj5f\" (UniqueName: \"kubernetes.io/projected/4bd51e0b-15c9-4042-ac7e-c05ed0a11374-kube-api-access-5gj5f\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.526820 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4bd51e0b-15c9-4042-ac7e-c05ed0a11374\") " pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.546569 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 24 03:17:18 crc kubenswrapper[4923]: I0224 03:17:18.579936 4923 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.078015 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 24 03:17:19 crc kubenswrapper[4923]: W0224 03:17:19.082075 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bd51e0b_15c9_4042_ac7e_c05ed0a11374.slice/crio-1745db0de1a40fac70d7455a8f28e1605fc9656614697f593b87b954b56e5be4 WatchSource:0}: Error finding container 1745db0de1a40fac70d7455a8f28e1605fc9656614697f593b87b954b56e5be4: Status 404 returned error can't find the container with id 1745db0de1a40fac70d7455a8f28e1605fc9656614697f593b87b954b56e5be4 Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.114484 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.114468 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4bd51e0b-15c9-4042-ac7e-c05ed0a11374","Type":"ContainerStarted","Data":"1745db0de1a40fac70d7455a8f28e1605fc9656614697f593b87b954b56e5be4"} Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.124882 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-g47dq"] Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.126276 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.137193 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.146855 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-g47dq"] Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.189968 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-config\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.190043 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-svc\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.190069 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.190096 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tg6\" (UniqueName: \"kubernetes.io/projected/0a590c80-1681-417c-b48e-b26f7f7a6222-kube-api-access-l6tg6\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " 
pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.190118 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.190229 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.190332 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.236697 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.255612 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.284580 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.290159 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.292538 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.292830 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5hz2x" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.293700 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.293814 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.293927 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.294058 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-config\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.294114 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-svc\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.294139 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: 
\"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.294176 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tg6\" (UniqueName: \"kubernetes.io/projected/0a590c80-1681-417c-b48e-b26f7f7a6222-kube-api-access-l6tg6\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.294197 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.294237 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.294280 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.295660 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: 
\"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.294072 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.294095 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.296183 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-config\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.297093 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.297816 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.298238 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-svc\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.299253 4923 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.304833 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.318320 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tg6\" (UniqueName: \"kubernetes.io/projected/0a590c80-1681-417c-b48e-b26f7f7a6222-kube-api-access-l6tg6\") pod \"dnsmasq-dns-d558885bc-g47dq\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395393 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2976\" (UniqueName: \"kubernetes.io/projected/6e4608b5-cf65-4bbc-b509-85261127fe10-kube-api-access-c2976\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395447 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e4608b5-cf65-4bbc-b509-85261127fe10-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395482 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e4608b5-cf65-4bbc-b509-85261127fe10-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395515 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e4608b5-cf65-4bbc-b509-85261127fe10-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395651 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395749 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e4608b5-cf65-4bbc-b509-85261127fe10-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395779 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395849 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") 
" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395865 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e4608b5-cf65-4bbc-b509-85261127fe10-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.395951 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.396039 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498438 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498486 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e4608b5-cf65-4bbc-b509-85261127fe10-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 
24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498508 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498542 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498575 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e4608b5-cf65-4bbc-b509-85261127fe10-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498592 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498622 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498691 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c2976\" (UniqueName: \"kubernetes.io/projected/6e4608b5-cf65-4bbc-b509-85261127fe10-kube-api-access-c2976\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498722 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e4608b5-cf65-4bbc-b509-85261127fe10-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498754 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e4608b5-cf65-4bbc-b509-85261127fe10-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.498807 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e4608b5-cf65-4bbc-b509-85261127fe10-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.499688 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.499704 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.499880 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.500494 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e4608b5-cf65-4bbc-b509-85261127fe10-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.500563 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e4608b5-cf65-4bbc-b509-85261127fe10-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.502223 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e4608b5-cf65-4bbc-b509-85261127fe10-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.502696 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.504032 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e4608b5-cf65-4bbc-b509-85261127fe10-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.504941 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e4608b5-cf65-4bbc-b509-85261127fe10-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.505647 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e4608b5-cf65-4bbc-b509-85261127fe10-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.520667 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2976\" (UniqueName: \"kubernetes.io/projected/6e4608b5-cf65-4bbc-b509-85261127fe10-kube-api-access-c2976\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.525936 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-g47dq"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.536497 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e4608b5-cf65-4bbc-b509-85261127fe10\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.660587 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.723571 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0" path="/var/lib/kubelet/pods/6a3b0fbb-1a43-4a8b-9c15-dff4832fbba0/volumes"
Feb 24 03:17:19 crc kubenswrapper[4923]: I0224 03:17:19.724437 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9950d0b-d980-4e4f-82b4-9f616c6c50a3" path="/var/lib/kubelet/pods/d9950d0b-d980-4e4f-82b4-9f616c6c50a3/volumes"
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:19.916701 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:19.916752 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:19.916793 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:19.917569 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84006aadd17b2e131a632622b49eac940374eaac532afbb7829f93e09553d367"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:19.917624 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://84006aadd17b2e131a632622b49eac940374eaac532afbb7829f93e09553d367" gracePeriod=600
Feb 24 03:17:20 crc kubenswrapper[4923]: W0224 03:17:19.990921 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a590c80_1681_417c_b48e_b26f7f7a6222.slice/crio-947a3ad9e4d84f692e36bf624cc7c1afbd3ce31a0d2f34bf19721ebf95371215 WatchSource:0}: Error finding container 947a3ad9e4d84f692e36bf624cc7c1afbd3ce31a0d2f34bf19721ebf95371215: Status 404 returned error can't find the container with id 947a3ad9e4d84f692e36bf624cc7c1afbd3ce31a0d2f34bf19721ebf95371215
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:19.995065 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-g47dq"]
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:20.127106 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="84006aadd17b2e131a632622b49eac940374eaac532afbb7829f93e09553d367" exitCode=0
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:20.127209 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"84006aadd17b2e131a632622b49eac940374eaac532afbb7829f93e09553d367"}
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:20.127579 4923 scope.go:117] "RemoveContainer" containerID="40cb3d82b93cff9bd3bf829c2417332644f1c7038c262573b0f2c1eba50e9cc2"
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:20.128715 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-g47dq" event={"ID":"0a590c80-1681-417c-b48e-b26f7f7a6222","Type":"ContainerStarted","Data":"947a3ad9e4d84f692e36bf624cc7c1afbd3ce31a0d2f34bf19721ebf95371215"}
Feb 24 03:17:20 crc kubenswrapper[4923]: I0224 03:17:20.133884 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 24 03:17:21 crc kubenswrapper[4923]: I0224 03:17:21.146127 4923 generic.go:334] "Generic (PLEG): container finished" podID="0a590c80-1681-417c-b48e-b26f7f7a6222" containerID="f119e3d6ea463e455d0f567807894cc1ce27f753dec96c82b1164ca030a70c40" exitCode=0
Feb 24 03:17:21 crc kubenswrapper[4923]: I0224 03:17:21.147043 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-g47dq" event={"ID":"0a590c80-1681-417c-b48e-b26f7f7a6222","Type":"ContainerDied","Data":"f119e3d6ea463e455d0f567807894cc1ce27f753dec96c82b1164ca030a70c40"}
Feb 24 03:17:21 crc kubenswrapper[4923]: I0224 03:17:21.149947 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e4608b5-cf65-4bbc-b509-85261127fe10","Type":"ContainerStarted","Data":"0ff87360eaee928cac133942885f552c1faf7af07dd4c6f11cf1344b2bd2b35a"}
Feb 24 03:17:21 crc kubenswrapper[4923]: I0224 03:17:21.153735 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4bd51e0b-15c9-4042-ac7e-c05ed0a11374","Type":"ContainerStarted","Data":"a0c1b634fab8c18498683ca3e2c5b00aa4ac76ad1b0c3270c7e30b56f020c2fc"}
Feb 24 03:17:21 crc kubenswrapper[4923]: I0224 03:17:21.162002 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"0faeb363e0b14f83a047ef04f8fa2df18f1991b14418890ba609de06ecd5c251"}
Feb 24 03:17:22 crc kubenswrapper[4923]: I0224 03:17:22.179024 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-g47dq" event={"ID":"0a590c80-1681-417c-b48e-b26f7f7a6222","Type":"ContainerStarted","Data":"479254f6703bb231ee789daf18eca9029faaeba7cbaed6b872b94fe4372a0fbe"}
Feb 24 03:17:22 crc kubenswrapper[4923]: I0224 03:17:22.180012 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-g47dq"
Feb 24 03:17:22 crc kubenswrapper[4923]: I0224 03:17:22.229971 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-g47dq" podStartSLOduration=3.229947985 podStartE2EDuration="3.229947985s" podCreationTimestamp="2026-02-24 03:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:17:22.220183172 +0000 UTC m=+1366.237253995" watchObservedRunningTime="2026-02-24 03:17:22.229947985 +0000 UTC m=+1366.247018808"
Feb 24 03:17:23 crc kubenswrapper[4923]: I0224 03:17:23.208179 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e4608b5-cf65-4bbc-b509-85261127fe10","Type":"ContainerStarted","Data":"e0944d41e441893bb01904afe6fdc165eb65ca087764b64c447b00c4535a1c4e"}
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.594633 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mqr7h"]
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.600768 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.614314 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqr7h"]
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.699131 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-catalog-content\") pod \"redhat-operators-mqr7h\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.699207 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-utilities\") pod \"redhat-operators-mqr7h\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.699429 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78shj\" (UniqueName: \"kubernetes.io/projected/b29cc7af-b13d-4f78-8894-eb42055bb1e9-kube-api-access-78shj\") pod \"redhat-operators-mqr7h\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.801826 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78shj\" (UniqueName: \"kubernetes.io/projected/b29cc7af-b13d-4f78-8894-eb42055bb1e9-kube-api-access-78shj\") pod \"redhat-operators-mqr7h\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.801962 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-catalog-content\") pod \"redhat-operators-mqr7h\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.802027 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-utilities\") pod \"redhat-operators-mqr7h\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.802500 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-catalog-content\") pod \"redhat-operators-mqr7h\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.802553 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-utilities\") pod \"redhat-operators-mqr7h\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.823576 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78shj\" (UniqueName: \"kubernetes.io/projected/b29cc7af-b13d-4f78-8894-eb42055bb1e9-kube-api-access-78shj\") pod \"redhat-operators-mqr7h\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:28 crc kubenswrapper[4923]: I0224 03:17:28.936754 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mqr7h"
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.438324 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mqr7h"]
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.528959 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-g47dq"
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.588226 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-g57rb"]
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.588737 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" podUID="d73557bd-a940-406b-b9bb-9f43f09b5f77" containerName="dnsmasq-dns" containerID="cri-o://a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe" gracePeriod=10
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.803813 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-4v9fw"]
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.807284 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.819339 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-4v9fw"]
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.921327 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.921651 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqq5k\" (UniqueName: \"kubernetes.io/projected/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-kube-api-access-vqq5k\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.921699 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.921753 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-config\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.921773 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.921792 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:29 crc kubenswrapper[4923]: I0224 03:17:29.921816 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.023666 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.023734 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqq5k\" (UniqueName: \"kubernetes.io/projected/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-kube-api-access-vqq5k\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.023775 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.023826 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-config\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.023845 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.023862 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.023890 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.024743 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.025383 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.026136 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.026929 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.027338 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-config\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.027421 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.067478 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqq5k\" (UniqueName: \"kubernetes.io/projected/7ad3dfbc-174b-4b0c-9d41-a0c51eead210-kube-api-access-vqq5k\") pod \"dnsmasq-dns-78c64bc9c5-4v9fw\" (UID: \"7ad3dfbc-174b-4b0c-9d41-a0c51eead210\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.139169 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.197492 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.311966 4923 generic.go:334] "Generic (PLEG): container finished" podID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerID="214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce" exitCode=0
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.312190 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqr7h" event={"ID":"b29cc7af-b13d-4f78-8894-eb42055bb1e9","Type":"ContainerDied","Data":"214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce"}
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.312215 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqr7h" event={"ID":"b29cc7af-b13d-4f78-8894-eb42055bb1e9","Type":"ContainerStarted","Data":"a05eb28ac6a18e40c1a528ce7a0c66cd937590cab122f34d75817ae28d33cd40"}
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.324861 4923 generic.go:334] "Generic (PLEG): container finished" podID="d73557bd-a940-406b-b9bb-9f43f09b5f77" containerID="a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe" exitCode=0
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.325086 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" event={"ID":"d73557bd-a940-406b-b9bb-9f43f09b5f77","Type":"ContainerDied","Data":"a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe"}
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.325094 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.325107 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-g57rb" event={"ID":"d73557bd-a940-406b-b9bb-9f43f09b5f77","Type":"ContainerDied","Data":"a83cc4666013b127452a26ddbdfb69a798af4e10808c99f9d042f360066015d1"}
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.325126 4923 scope.go:117] "RemoveContainer" containerID="a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.327432 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-swift-storage-0\") pod \"d73557bd-a940-406b-b9bb-9f43f09b5f77\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") "
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.327496 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-sb\") pod \"d73557bd-a940-406b-b9bb-9f43f09b5f77\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") "
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.327518 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-svc\") pod \"d73557bd-a940-406b-b9bb-9f43f09b5f77\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") "
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.327627 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4trb\" (UniqueName: \"kubernetes.io/projected/d73557bd-a940-406b-b9bb-9f43f09b5f77-kube-api-access-g4trb\") pod \"d73557bd-a940-406b-b9bb-9f43f09b5f77\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") "
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.327699 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-nb\") pod \"d73557bd-a940-406b-b9bb-9f43f09b5f77\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") "
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.327738 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-config\") pod \"d73557bd-a940-406b-b9bb-9f43f09b5f77\" (UID: \"d73557bd-a940-406b-b9bb-9f43f09b5f77\") "
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.335465 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73557bd-a940-406b-b9bb-9f43f09b5f77-kube-api-access-g4trb" (OuterVolumeSpecName: "kube-api-access-g4trb") pod "d73557bd-a940-406b-b9bb-9f43f09b5f77" (UID: "d73557bd-a940-406b-b9bb-9f43f09b5f77"). InnerVolumeSpecName "kube-api-access-g4trb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.355539 4923 scope.go:117] "RemoveContainer" containerID="b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.393032 4923 scope.go:117] "RemoveContainer" containerID="a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe"
Feb 24 03:17:30 crc kubenswrapper[4923]: E0224 03:17:30.393716 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe\": container with ID starting with a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe not found: ID does not exist" containerID="a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.393751 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe"} err="failed to get container status \"a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe\": rpc error: code = NotFound desc = could not find container \"a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe\": container with ID starting with a119812fcbffa0f749bbf936b9908d146d0c37b20e201a1fdebcbb7055d20dfe not found: ID does not exist"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.393773 4923 scope.go:117] "RemoveContainer" containerID="b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36"
Feb 24 03:17:30 crc kubenswrapper[4923]: E0224 03:17:30.396712 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36\": container with ID starting with b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36 not found: ID does not exist" containerID="b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.396744 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36"} err="failed to get container status \"b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36\": rpc error: code = NotFound desc = could not find container \"b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36\": container with ID starting with b5b65b2aad6f39e5bd62f053d06cf3229e3eb555413a3b8e976799399df0bb36 not found: ID does not exist"
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.412127 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d73557bd-a940-406b-b9bb-9f43f09b5f77" (UID: "d73557bd-a940-406b-b9bb-9f43f09b5f77"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.412315 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d73557bd-a940-406b-b9bb-9f43f09b5f77" (UID: "d73557bd-a940-406b-b9bb-9f43f09b5f77"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.422236 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d73557bd-a940-406b-b9bb-9f43f09b5f77" (UID: "d73557bd-a940-406b-b9bb-9f43f09b5f77"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.424059 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-config" (OuterVolumeSpecName: "config") pod "d73557bd-a940-406b-b9bb-9f43f09b5f77" (UID: "d73557bd-a940-406b-b9bb-9f43f09b5f77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.430601 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-config\") on node \"crc\" DevicePath \"\""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.430627 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.430637 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.430648 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4trb\" (UniqueName: \"kubernetes.io/projected/d73557bd-a940-406b-b9bb-9f43f09b5f77-kube-api-access-g4trb\") on node \"crc\" DevicePath \"\""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.430659 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.450157 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d73557bd-a940-406b-b9bb-9f43f09b5f77" (UID: "d73557bd-a940-406b-b9bb-9f43f09b5f77"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.532327 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d73557bd-a940-406b-b9bb-9f43f09b5f77-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.661852 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-g57rb"]
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.669984 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-g57rb"]
Feb 24 03:17:30 crc kubenswrapper[4923]: I0224 03:17:30.755407 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-4v9fw"]
Feb 24 03:17:30 crc kubenswrapper[4923]: W0224 03:17:30.759993 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ad3dfbc_174b_4b0c_9d41_a0c51eead210.slice/crio-0d231eb80a2dee6d90cb94ac079e5a4fab0510516b63e6c172dd4ea5ddbb13b2 WatchSource:0}: Error finding container 0d231eb80a2dee6d90cb94ac079e5a4fab0510516b63e6c172dd4ea5ddbb13b2: Status 404 returned error can't find the container with id 0d231eb80a2dee6d90cb94ac079e5a4fab0510516b63e6c172dd4ea5ddbb13b2
Feb 24 03:17:31 crc kubenswrapper[4923]: I0224 03:17:31.335887 4923 generic.go:334] "Generic (PLEG): container finished" podID="7ad3dfbc-174b-4b0c-9d41-a0c51eead210" containerID="51e5a948386dbdec25f2ab04134fdb895376b346f9ed85501a49b7f960794c5a" exitCode=0
Feb 24 03:17:31 crc kubenswrapper[4923]: I0224 03:17:31.336003 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw" event={"ID":"7ad3dfbc-174b-4b0c-9d41-a0c51eead210","Type":"ContainerDied","Data":"51e5a948386dbdec25f2ab04134fdb895376b346f9ed85501a49b7f960794c5a"} Feb 24 03:17:31 crc kubenswrapper[4923]: I0224 03:17:31.336188 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw" event={"ID":"7ad3dfbc-174b-4b0c-9d41-a0c51eead210","Type":"ContainerStarted","Data":"0d231eb80a2dee6d90cb94ac079e5a4fab0510516b63e6c172dd4ea5ddbb13b2"} Feb 24 03:17:31 crc kubenswrapper[4923]: I0224 03:17:31.337984 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqr7h" event={"ID":"b29cc7af-b13d-4f78-8894-eb42055bb1e9","Type":"ContainerStarted","Data":"4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d"} Feb 24 03:17:31 crc kubenswrapper[4923]: I0224 03:17:31.728133 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73557bd-a940-406b-b9bb-9f43f09b5f77" path="/var/lib/kubelet/pods/d73557bd-a940-406b-b9bb-9f43f09b5f77/volumes" Feb 24 03:17:32 crc kubenswrapper[4923]: I0224 03:17:32.353069 4923 generic.go:334] "Generic (PLEG): container finished" podID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerID="4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d" exitCode=0 Feb 24 03:17:32 crc kubenswrapper[4923]: I0224 03:17:32.353167 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqr7h" event={"ID":"b29cc7af-b13d-4f78-8894-eb42055bb1e9","Type":"ContainerDied","Data":"4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d"} Feb 24 03:17:32 crc kubenswrapper[4923]: I0224 03:17:32.357022 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw" event={"ID":"7ad3dfbc-174b-4b0c-9d41-a0c51eead210","Type":"ContainerStarted","Data":"742941c749a3cbe6b9713bc666f7713ddabc15b07fadbbe7856486eb143bbcef"} Feb 24 03:17:32 crc 
kubenswrapper[4923]: I0224 03:17:32.357278 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw" Feb 24 03:17:32 crc kubenswrapper[4923]: I0224 03:17:32.423160 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw" podStartSLOduration=3.423134627 podStartE2EDuration="3.423134627s" podCreationTimestamp="2026-02-24 03:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:17:32.418950919 +0000 UTC m=+1376.436021772" watchObservedRunningTime="2026-02-24 03:17:32.423134627 +0000 UTC m=+1376.440205480" Feb 24 03:17:33 crc kubenswrapper[4923]: I0224 03:17:33.372540 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqr7h" event={"ID":"b29cc7af-b13d-4f78-8894-eb42055bb1e9","Type":"ContainerStarted","Data":"79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6"} Feb 24 03:17:38 crc kubenswrapper[4923]: I0224 03:17:38.937836 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mqr7h" Feb 24 03:17:38 crc kubenswrapper[4923]: I0224 03:17:38.938613 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mqr7h" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.036706 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mqr7h" podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerName="registry-server" probeResult="failure" output=< Feb 24 03:17:40 crc kubenswrapper[4923]: timeout: failed to connect service ":50051" within 1s Feb 24 03:17:40 crc kubenswrapper[4923]: > Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.199477 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-78c64bc9c5-4v9fw" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.220426 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mqr7h" podStartSLOduration=9.722829053 podStartE2EDuration="12.220408837s" podCreationTimestamp="2026-02-24 03:17:28 +0000 UTC" firstStartedPulling="2026-02-24 03:17:30.314638576 +0000 UTC m=+1374.331709389" lastFinishedPulling="2026-02-24 03:17:32.81221835 +0000 UTC m=+1376.829289173" observedRunningTime="2026-02-24 03:17:33.401072035 +0000 UTC m=+1377.418142888" watchObservedRunningTime="2026-02-24 03:17:40.220408837 +0000 UTC m=+1384.237479650" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.266867 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-g47dq"] Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.267139 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-g47dq" podUID="0a590c80-1681-417c-b48e-b26f7f7a6222" containerName="dnsmasq-dns" containerID="cri-o://479254f6703bb231ee789daf18eca9029faaeba7cbaed6b872b94fe4372a0fbe" gracePeriod=10 Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.462619 4923 generic.go:334] "Generic (PLEG): container finished" podID="0a590c80-1681-417c-b48e-b26f7f7a6222" containerID="479254f6703bb231ee789daf18eca9029faaeba7cbaed6b872b94fe4372a0fbe" exitCode=0 Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.462699 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-g47dq" event={"ID":"0a590c80-1681-417c-b48e-b26f7f7a6222","Type":"ContainerDied","Data":"479254f6703bb231ee789daf18eca9029faaeba7cbaed6b872b94fe4372a0fbe"} Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.740375 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.859000 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-swift-storage-0\") pod \"0a590c80-1681-417c-b48e-b26f7f7a6222\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.859116 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-svc\") pod \"0a590c80-1681-417c-b48e-b26f7f7a6222\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.859209 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6tg6\" (UniqueName: \"kubernetes.io/projected/0a590c80-1681-417c-b48e-b26f7f7a6222-kube-api-access-l6tg6\") pod \"0a590c80-1681-417c-b48e-b26f7f7a6222\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.859233 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-sb\") pod \"0a590c80-1681-417c-b48e-b26f7f7a6222\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.859276 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-openstack-edpm-ipam\") pod \"0a590c80-1681-417c-b48e-b26f7f7a6222\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.859329 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-nb\") pod \"0a590c80-1681-417c-b48e-b26f7f7a6222\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.859402 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-config\") pod \"0a590c80-1681-417c-b48e-b26f7f7a6222\" (UID: \"0a590c80-1681-417c-b48e-b26f7f7a6222\") " Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.868852 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a590c80-1681-417c-b48e-b26f7f7a6222-kube-api-access-l6tg6" (OuterVolumeSpecName: "kube-api-access-l6tg6") pod "0a590c80-1681-417c-b48e-b26f7f7a6222" (UID: "0a590c80-1681-417c-b48e-b26f7f7a6222"). InnerVolumeSpecName "kube-api-access-l6tg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.931110 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "0a590c80-1681-417c-b48e-b26f7f7a6222" (UID: "0a590c80-1681-417c-b48e-b26f7f7a6222"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.931340 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a590c80-1681-417c-b48e-b26f7f7a6222" (UID: "0a590c80-1681-417c-b48e-b26f7f7a6222"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.939838 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a590c80-1681-417c-b48e-b26f7f7a6222" (UID: "0a590c80-1681-417c-b48e-b26f7f7a6222"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.940226 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a590c80-1681-417c-b48e-b26f7f7a6222" (UID: "0a590c80-1681-417c-b48e-b26f7f7a6222"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.945458 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-config" (OuterVolumeSpecName: "config") pod "0a590c80-1681-417c-b48e-b26f7f7a6222" (UID: "0a590c80-1681-417c-b48e-b26f7f7a6222"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.961369 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.961403 4923 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.961415 4923 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.961426 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6tg6\" (UniqueName: \"kubernetes.io/projected/0a590c80-1681-417c-b48e-b26f7f7a6222-kube-api-access-l6tg6\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.961437 4923 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.961446 4923 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:40 crc kubenswrapper[4923]: I0224 03:17:40.961764 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0a590c80-1681-417c-b48e-b26f7f7a6222" (UID: 
"0a590c80-1681-417c-b48e-b26f7f7a6222"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:17:41 crc kubenswrapper[4923]: I0224 03:17:41.062623 4923 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a590c80-1681-417c-b48e-b26f7f7a6222-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:41 crc kubenswrapper[4923]: I0224 03:17:41.477209 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-g47dq" event={"ID":"0a590c80-1681-417c-b48e-b26f7f7a6222","Type":"ContainerDied","Data":"947a3ad9e4d84f692e36bf624cc7c1afbd3ce31a0d2f34bf19721ebf95371215"} Feb 24 03:17:41 crc kubenswrapper[4923]: I0224 03:17:41.477263 4923 scope.go:117] "RemoveContainer" containerID="479254f6703bb231ee789daf18eca9029faaeba7cbaed6b872b94fe4372a0fbe" Feb 24 03:17:41 crc kubenswrapper[4923]: I0224 03:17:41.477289 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-g47dq" Feb 24 03:17:41 crc kubenswrapper[4923]: I0224 03:17:41.514029 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-g47dq"] Feb 24 03:17:41 crc kubenswrapper[4923]: I0224 03:17:41.518672 4923 scope.go:117] "RemoveContainer" containerID="f119e3d6ea463e455d0f567807894cc1ce27f753dec96c82b1164ca030a70c40" Feb 24 03:17:41 crc kubenswrapper[4923]: I0224 03:17:41.522169 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-g47dq"] Feb 24 03:17:41 crc kubenswrapper[4923]: I0224 03:17:41.730938 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a590c80-1681-417c-b48e-b26f7f7a6222" path="/var/lib/kubelet/pods/0a590c80-1681-417c-b48e-b26f7f7a6222/volumes" Feb 24 03:17:48 crc kubenswrapper[4923]: I0224 03:17:48.992031 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6"] Feb 24 03:17:48 crc kubenswrapper[4923]: E0224 03:17:48.993009 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a590c80-1681-417c-b48e-b26f7f7a6222" containerName="init" Feb 24 03:17:48 crc kubenswrapper[4923]: I0224 03:17:48.993028 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a590c80-1681-417c-b48e-b26f7f7a6222" containerName="init" Feb 24 03:17:48 crc kubenswrapper[4923]: E0224 03:17:48.993057 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73557bd-a940-406b-b9bb-9f43f09b5f77" containerName="dnsmasq-dns" Feb 24 03:17:48 crc kubenswrapper[4923]: I0224 03:17:48.993065 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73557bd-a940-406b-b9bb-9f43f09b5f77" containerName="dnsmasq-dns" Feb 24 03:17:48 crc kubenswrapper[4923]: E0224 03:17:48.993088 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73557bd-a940-406b-b9bb-9f43f09b5f77" containerName="init" Feb 24 03:17:48 crc 
kubenswrapper[4923]: I0224 03:17:48.993096 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73557bd-a940-406b-b9bb-9f43f09b5f77" containerName="init" Feb 24 03:17:48 crc kubenswrapper[4923]: E0224 03:17:48.993117 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a590c80-1681-417c-b48e-b26f7f7a6222" containerName="dnsmasq-dns" Feb 24 03:17:48 crc kubenswrapper[4923]: I0224 03:17:48.993125 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a590c80-1681-417c-b48e-b26f7f7a6222" containerName="dnsmasq-dns" Feb 24 03:17:48 crc kubenswrapper[4923]: I0224 03:17:48.993378 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73557bd-a940-406b-b9bb-9f43f09b5f77" containerName="dnsmasq-dns" Feb 24 03:17:48 crc kubenswrapper[4923]: I0224 03:17:48.993405 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a590c80-1681-417c-b48e-b26f7f7a6222" containerName="dnsmasq-dns" Feb 24 03:17:48 crc kubenswrapper[4923]: I0224 03:17:48.994098 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.006733 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.006830 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.007793 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mqr7h" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.009980 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.011170 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.059927 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6"] Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.108047 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mqr7h" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.119075 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpv9g\" (UniqueName: \"kubernetes.io/projected/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-kube-api-access-qpv9g\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.119201 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.120186 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.120349 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.222759 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.222850 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.223172 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpv9g\" (UniqueName: \"kubernetes.io/projected/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-kube-api-access-qpv9g\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.223235 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.232851 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.232957 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: 
I0224 03:17:49.234876 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.245577 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpv9g\" (UniqueName: \"kubernetes.io/projected/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-kube-api-access-qpv9g\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.280280 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqr7h"] Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.352027 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:17:49 crc kubenswrapper[4923]: I0224 03:17:49.920525 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6"] Feb 24 03:17:50 crc kubenswrapper[4923]: I0224 03:17:50.585415 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" event={"ID":"d5eb03b7-77c3-4c05-a735-ce0c901c91cb","Type":"ContainerStarted","Data":"df31fb7a7a2578e6d2e641fa9cfa39dc51725a57a34fc2dd13aad17ed59dd18c"} Feb 24 03:17:50 crc kubenswrapper[4923]: I0224 03:17:50.585468 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mqr7h" podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerName="registry-server" containerID="cri-o://79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6" gracePeriod=2 Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.023841 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mqr7h" Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.158255 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-catalog-content\") pod \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.158406 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78shj\" (UniqueName: \"kubernetes.io/projected/b29cc7af-b13d-4f78-8894-eb42055bb1e9-kube-api-access-78shj\") pod \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.158445 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-utilities\") pod \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\" (UID: \"b29cc7af-b13d-4f78-8894-eb42055bb1e9\") " Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.159496 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-utilities" (OuterVolumeSpecName: "utilities") pod "b29cc7af-b13d-4f78-8894-eb42055bb1e9" (UID: "b29cc7af-b13d-4f78-8894-eb42055bb1e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.166620 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29cc7af-b13d-4f78-8894-eb42055bb1e9-kube-api-access-78shj" (OuterVolumeSpecName: "kube-api-access-78shj") pod "b29cc7af-b13d-4f78-8894-eb42055bb1e9" (UID: "b29cc7af-b13d-4f78-8894-eb42055bb1e9"). InnerVolumeSpecName "kube-api-access-78shj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.260545 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78shj\" (UniqueName: \"kubernetes.io/projected/b29cc7af-b13d-4f78-8894-eb42055bb1e9-kube-api-access-78shj\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.260577 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.260587 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b29cc7af-b13d-4f78-8894-eb42055bb1e9" (UID: "b29cc7af-b13d-4f78-8894-eb42055bb1e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.361867 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29cc7af-b13d-4f78-8894-eb42055bb1e9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.594527 4923 generic.go:334] "Generic (PLEG): container finished" podID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerID="79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6" exitCode=0 Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.594569 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mqr7h" event={"ID":"b29cc7af-b13d-4f78-8894-eb42055bb1e9","Type":"ContainerDied","Data":"79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6"} Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.594594 4923 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-mqr7h" event={"ID":"b29cc7af-b13d-4f78-8894-eb42055bb1e9","Type":"ContainerDied","Data":"a05eb28ac6a18e40c1a528ce7a0c66cd937590cab122f34d75817ae28d33cd40"} Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.594611 4923 scope.go:117] "RemoveContainer" containerID="79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6" Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.594721 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mqr7h" Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.634820 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mqr7h"] Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.643802 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mqr7h"] Feb 24 03:17:51 crc kubenswrapper[4923]: I0224 03:17:51.723225 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" path="/var/lib/kubelet/pods/b29cc7af-b13d-4f78-8894-eb42055bb1e9/volumes" Feb 24 03:17:52 crc kubenswrapper[4923]: I0224 03:17:52.446403 4923 scope.go:117] "RemoveContainer" containerID="4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d" Feb 24 03:17:52 crc kubenswrapper[4923]: I0224 03:17:52.483354 4923 scope.go:117] "RemoveContainer" containerID="214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce" Feb 24 03:17:52 crc kubenswrapper[4923]: I0224 03:17:52.520415 4923 scope.go:117] "RemoveContainer" containerID="79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6" Feb 24 03:17:52 crc kubenswrapper[4923]: E0224 03:17:52.520961 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6\": container with ID starting with 
79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6 not found: ID does not exist" containerID="79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6" Feb 24 03:17:52 crc kubenswrapper[4923]: I0224 03:17:52.521007 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6"} err="failed to get container status \"79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6\": rpc error: code = NotFound desc = could not find container \"79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6\": container with ID starting with 79c0a0c3630dd65b07c142f20967e969f34bf13ff178eddb6943b582aab7c9e6 not found: ID does not exist" Feb 24 03:17:52 crc kubenswrapper[4923]: I0224 03:17:52.521032 4923 scope.go:117] "RemoveContainer" containerID="4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d" Feb 24 03:17:52 crc kubenswrapper[4923]: E0224 03:17:52.521539 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d\": container with ID starting with 4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d not found: ID does not exist" containerID="4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d" Feb 24 03:17:52 crc kubenswrapper[4923]: I0224 03:17:52.521614 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d"} err="failed to get container status \"4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d\": rpc error: code = NotFound desc = could not find container \"4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d\": container with ID starting with 4e6ddd32405b1368540bc4d413bb712c4e09300fe462b23d1826269040a63c5d not found: ID does not 
exist" Feb 24 03:17:52 crc kubenswrapper[4923]: I0224 03:17:52.521648 4923 scope.go:117] "RemoveContainer" containerID="214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce" Feb 24 03:17:52 crc kubenswrapper[4923]: E0224 03:17:52.522227 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce\": container with ID starting with 214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce not found: ID does not exist" containerID="214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce" Feb 24 03:17:52 crc kubenswrapper[4923]: I0224 03:17:52.522257 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce"} err="failed to get container status \"214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce\": rpc error: code = NotFound desc = could not find container \"214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce\": container with ID starting with 214793828b8e41dff6c4023dbfd959a0a19e1004a8a305e3888769d8e6c913ce not found: ID does not exist" Feb 24 03:17:54 crc kubenswrapper[4923]: I0224 03:17:54.625730 4923 generic.go:334] "Generic (PLEG): container finished" podID="4bd51e0b-15c9-4042-ac7e-c05ed0a11374" containerID="a0c1b634fab8c18498683ca3e2c5b00aa4ac76ad1b0c3270c7e30b56f020c2fc" exitCode=0 Feb 24 03:17:54 crc kubenswrapper[4923]: I0224 03:17:54.625839 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4bd51e0b-15c9-4042-ac7e-c05ed0a11374","Type":"ContainerDied","Data":"a0c1b634fab8c18498683ca3e2c5b00aa4ac76ad1b0c3270c7e30b56f020c2fc"} Feb 24 03:17:55 crc kubenswrapper[4923]: I0224 03:17:55.636366 4923 generic.go:334] "Generic (PLEG): container finished" podID="6e4608b5-cf65-4bbc-b509-85261127fe10" 
containerID="e0944d41e441893bb01904afe6fdc165eb65ca087764b64c447b00c4535a1c4e" exitCode=0 Feb 24 03:17:55 crc kubenswrapper[4923]: I0224 03:17:55.636434 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e4608b5-cf65-4bbc-b509-85261127fe10","Type":"ContainerDied","Data":"e0944d41e441893bb01904afe6fdc165eb65ca087764b64c447b00c4535a1c4e"} Feb 24 03:17:58 crc kubenswrapper[4923]: I0224 03:17:58.668602 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e4608b5-cf65-4bbc-b509-85261127fe10","Type":"ContainerStarted","Data":"d02d73caba95f24cceb52b3a7e0e73d56a01ff5f9d029a5fc80391c3282c05d5"} Feb 24 03:17:58 crc kubenswrapper[4923]: I0224 03:17:58.669387 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:17:58 crc kubenswrapper[4923]: I0224 03:17:58.678978 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4bd51e0b-15c9-4042-ac7e-c05ed0a11374","Type":"ContainerStarted","Data":"37d2b05ecd451472fac64fe64625c3b60e8b78cacfd20a37f5bc4ca381720e68"} Feb 24 03:17:58 crc kubenswrapper[4923]: I0224 03:17:58.679581 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 24 03:17:58 crc kubenswrapper[4923]: I0224 03:17:58.681284 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" event={"ID":"d5eb03b7-77c3-4c05-a735-ce0c901c91cb","Type":"ContainerStarted","Data":"a28347231c9cfe7276ae84fa97eca547dafba7c98c7852b5ad4a56621ba956b5"} Feb 24 03:17:58 crc kubenswrapper[4923]: I0224 03:17:58.697153 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.697132407 podStartE2EDuration="39.697132407s" podCreationTimestamp="2026-02-24 03:17:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:17:58.693275077 +0000 UTC m=+1402.710345880" watchObservedRunningTime="2026-02-24 03:17:58.697132407 +0000 UTC m=+1402.714203230" Feb 24 03:17:58 crc kubenswrapper[4923]: I0224 03:17:58.722083 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.722061285 podStartE2EDuration="40.722061285s" podCreationTimestamp="2026-02-24 03:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 03:17:58.715200517 +0000 UTC m=+1402.732271330" watchObservedRunningTime="2026-02-24 03:17:58.722061285 +0000 UTC m=+1402.739132098" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.069840 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" podStartSLOduration=8.702423782 podStartE2EDuration="17.06981329s" podCreationTimestamp="2026-02-24 03:17:48 +0000 UTC" firstStartedPulling="2026-02-24 03:17:49.924974739 +0000 UTC m=+1393.942045542" lastFinishedPulling="2026-02-24 03:17:58.292364227 +0000 UTC m=+1402.309435050" observedRunningTime="2026-02-24 03:17:58.743694678 +0000 UTC m=+1402.760765481" watchObservedRunningTime="2026-02-24 03:18:05.06981329 +0000 UTC m=+1409.086884113" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.075190 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-29ckd"] Feb 24 03:18:05 crc kubenswrapper[4923]: E0224 03:18:05.075799 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerName="extract-utilities" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.075832 4923 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerName="extract-utilities" Feb 24 03:18:05 crc kubenswrapper[4923]: E0224 03:18:05.075853 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerName="registry-server" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.075865 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerName="registry-server" Feb 24 03:18:05 crc kubenswrapper[4923]: E0224 03:18:05.075896 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerName="extract-content" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.075908 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerName="extract-content" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.077234 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="b29cc7af-b13d-4f78-8894-eb42055bb1e9" containerName="registry-server" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.086599 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.100060 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29ckd"] Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.169908 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-catalog-content\") pod \"community-operators-29ckd\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.170227 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8r5t\" (UniqueName: \"kubernetes.io/projected/c446e1e9-ba6a-40ed-81d4-9e6df663a230-kube-api-access-k8r5t\") pod \"community-operators-29ckd\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.170477 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-utilities\") pod \"community-operators-29ckd\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.272073 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-utilities\") pod \"community-operators-29ckd\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.272181 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-catalog-content\") pod \"community-operators-29ckd\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.272215 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8r5t\" (UniqueName: \"kubernetes.io/projected/c446e1e9-ba6a-40ed-81d4-9e6df663a230-kube-api-access-k8r5t\") pod \"community-operators-29ckd\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.272716 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-catalog-content\") pod \"community-operators-29ckd\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.272715 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-utilities\") pod \"community-operators-29ckd\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.298935 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8r5t\" (UniqueName: \"kubernetes.io/projected/c446e1e9-ba6a-40ed-81d4-9e6df663a230-kube-api-access-k8r5t\") pod \"community-operators-29ckd\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.425538 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:05 crc kubenswrapper[4923]: I0224 03:18:05.949002 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29ckd"] Feb 24 03:18:05 crc kubenswrapper[4923]: W0224 03:18:05.953518 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc446e1e9_ba6a_40ed_81d4_9e6df663a230.slice/crio-52eef34ca859370758257f0ac158cabf21974746cc50db713b10b2519a737dac WatchSource:0}: Error finding container 52eef34ca859370758257f0ac158cabf21974746cc50db713b10b2519a737dac: Status 404 returned error can't find the container with id 52eef34ca859370758257f0ac158cabf21974746cc50db713b10b2519a737dac Feb 24 03:18:06 crc kubenswrapper[4923]: I0224 03:18:06.753155 4923 generic.go:334] "Generic (PLEG): container finished" podID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerID="8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7" exitCode=0 Feb 24 03:18:06 crc kubenswrapper[4923]: I0224 03:18:06.753212 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29ckd" event={"ID":"c446e1e9-ba6a-40ed-81d4-9e6df663a230","Type":"ContainerDied","Data":"8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7"} Feb 24 03:18:06 crc kubenswrapper[4923]: I0224 03:18:06.753266 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29ckd" event={"ID":"c446e1e9-ba6a-40ed-81d4-9e6df663a230","Type":"ContainerStarted","Data":"52eef34ca859370758257f0ac158cabf21974746cc50db713b10b2519a737dac"} Feb 24 03:18:08 crc kubenswrapper[4923]: I0224 03:18:08.551544 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 24 03:18:08 crc kubenswrapper[4923]: I0224 03:18:08.788090 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-29ckd" event={"ID":"c446e1e9-ba6a-40ed-81d4-9e6df663a230","Type":"ContainerStarted","Data":"ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547"} Feb 24 03:18:09 crc kubenswrapper[4923]: I0224 03:18:09.665246 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 24 03:18:09 crc kubenswrapper[4923]: I0224 03:18:09.798035 4923 generic.go:334] "Generic (PLEG): container finished" podID="d5eb03b7-77c3-4c05-a735-ce0c901c91cb" containerID="a28347231c9cfe7276ae84fa97eca547dafba7c98c7852b5ad4a56621ba956b5" exitCode=0 Feb 24 03:18:09 crc kubenswrapper[4923]: I0224 03:18:09.798360 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" event={"ID":"d5eb03b7-77c3-4c05-a735-ce0c901c91cb","Type":"ContainerDied","Data":"a28347231c9cfe7276ae84fa97eca547dafba7c98c7852b5ad4a56621ba956b5"} Feb 24 03:18:09 crc kubenswrapper[4923]: I0224 03:18:09.801008 4923 generic.go:334] "Generic (PLEG): container finished" podID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerID="ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547" exitCode=0 Feb 24 03:18:09 crc kubenswrapper[4923]: I0224 03:18:09.801041 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29ckd" event={"ID":"c446e1e9-ba6a-40ed-81d4-9e6df663a230","Type":"ContainerDied","Data":"ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547"} Feb 24 03:18:10 crc kubenswrapper[4923]: I0224 03:18:10.811179 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29ckd" event={"ID":"c446e1e9-ba6a-40ed-81d4-9e6df663a230","Type":"ContainerStarted","Data":"3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae"} Feb 24 03:18:10 crc kubenswrapper[4923]: I0224 03:18:10.842100 4923 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-29ckd" podStartSLOduration=2.144582856 podStartE2EDuration="5.842079697s" podCreationTimestamp="2026-02-24 03:18:05 +0000 UTC" firstStartedPulling="2026-02-24 03:18:06.755919614 +0000 UTC m=+1410.772990427" lastFinishedPulling="2026-02-24 03:18:10.453416455 +0000 UTC m=+1414.470487268" observedRunningTime="2026-02-24 03:18:10.836765209 +0000 UTC m=+1414.853836062" watchObservedRunningTime="2026-02-24 03:18:10.842079697 +0000 UTC m=+1414.859150510" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.250582 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.302901 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-inventory\") pod \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.303058 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpv9g\" (UniqueName: \"kubernetes.io/projected/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-kube-api-access-qpv9g\") pod \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.303134 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-repo-setup-combined-ca-bundle\") pod \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.303181 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-ssh-key-openstack-edpm-ipam\") pod \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\" (UID: \"d5eb03b7-77c3-4c05-a735-ce0c901c91cb\") " Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.309250 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d5eb03b7-77c3-4c05-a735-ce0c901c91cb" (UID: "d5eb03b7-77c3-4c05-a735-ce0c901c91cb"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.318005 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-kube-api-access-qpv9g" (OuterVolumeSpecName: "kube-api-access-qpv9g") pod "d5eb03b7-77c3-4c05-a735-ce0c901c91cb" (UID: "d5eb03b7-77c3-4c05-a735-ce0c901c91cb"). InnerVolumeSpecName "kube-api-access-qpv9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.343144 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-inventory" (OuterVolumeSpecName: "inventory") pod "d5eb03b7-77c3-4c05-a735-ce0c901c91cb" (UID: "d5eb03b7-77c3-4c05-a735-ce0c901c91cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.343872 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d5eb03b7-77c3-4c05-a735-ce0c901c91cb" (UID: "d5eb03b7-77c3-4c05-a735-ce0c901c91cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.404996 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpv9g\" (UniqueName: \"kubernetes.io/projected/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-kube-api-access-qpv9g\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.405156 4923 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.405211 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.405266 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5eb03b7-77c3-4c05-a735-ce0c901c91cb-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.823512 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.823967 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6" event={"ID":"d5eb03b7-77c3-4c05-a735-ce0c901c91cb","Type":"ContainerDied","Data":"df31fb7a7a2578e6d2e641fa9cfa39dc51725a57a34fc2dd13aad17ed59dd18c"} Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.823989 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df31fb7a7a2578e6d2e641fa9cfa39dc51725a57a34fc2dd13aad17ed59dd18c" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.903860 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc"] Feb 24 03:18:11 crc kubenswrapper[4923]: E0224 03:18:11.904685 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5eb03b7-77c3-4c05-a735-ce0c901c91cb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.904708 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5eb03b7-77c3-4c05-a735-ce0c901c91cb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.904933 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5eb03b7-77c3-4c05-a735-ce0c901c91cb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.905741 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.908282 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.908823 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.910442 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.910939 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:18:11 crc kubenswrapper[4923]: I0224 03:18:11.932657 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc"] Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.019692 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r85nc\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.019786 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzqwc\" (UniqueName: \"kubernetes.io/projected/3816ebdb-67f1-4d77-835e-fd9323d883fd-kube-api-access-pzqwc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r85nc\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.019831 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r85nc\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.121470 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzqwc\" (UniqueName: \"kubernetes.io/projected/3816ebdb-67f1-4d77-835e-fd9323d883fd-kube-api-access-pzqwc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r85nc\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.121623 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r85nc\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.121724 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r85nc\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.126338 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r85nc\" (UID: 
\"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.126586 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r85nc\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.140790 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzqwc\" (UniqueName: \"kubernetes.io/projected/3816ebdb-67f1-4d77-835e-fd9323d883fd-kube-api-access-pzqwc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-r85nc\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.220527 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:12 crc kubenswrapper[4923]: I0224 03:18:12.820560 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc"] Feb 24 03:18:12 crc kubenswrapper[4923]: W0224 03:18:12.823743 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3816ebdb_67f1_4d77_835e_fd9323d883fd.slice/crio-5f5c1ff6512467ce6a703bc974d4b557eeef915cd890bee69a336a5c56ec6001 WatchSource:0}: Error finding container 5f5c1ff6512467ce6a703bc974d4b557eeef915cd890bee69a336a5c56ec6001: Status 404 returned error can't find the container with id 5f5c1ff6512467ce6a703bc974d4b557eeef915cd890bee69a336a5c56ec6001 Feb 24 03:18:13 crc kubenswrapper[4923]: I0224 03:18:13.843128 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" event={"ID":"3816ebdb-67f1-4d77-835e-fd9323d883fd","Type":"ContainerStarted","Data":"5fd13d2cbaf2fc6bd4f0971cf54effa6124edf0d267f067e43601d97f620cb10"} Feb 24 03:18:13 crc kubenswrapper[4923]: I0224 03:18:13.843492 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" event={"ID":"3816ebdb-67f1-4d77-835e-fd9323d883fd","Type":"ContainerStarted","Data":"5f5c1ff6512467ce6a703bc974d4b557eeef915cd890bee69a336a5c56ec6001"} Feb 24 03:18:13 crc kubenswrapper[4923]: I0224 03:18:13.868927 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" podStartSLOduration=2.492149686 podStartE2EDuration="2.868910548s" podCreationTimestamp="2026-02-24 03:18:11 +0000 UTC" firstStartedPulling="2026-02-24 03:18:12.827495471 +0000 UTC m=+1416.844566284" lastFinishedPulling="2026-02-24 03:18:13.204256333 +0000 UTC m=+1417.221327146" observedRunningTime="2026-02-24 
03:18:13.863001265 +0000 UTC m=+1417.880072148" watchObservedRunningTime="2026-02-24 03:18:13.868910548 +0000 UTC m=+1417.885981361" Feb 24 03:18:15 crc kubenswrapper[4923]: I0224 03:18:15.425964 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:15 crc kubenswrapper[4923]: I0224 03:18:15.426450 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:15 crc kubenswrapper[4923]: I0224 03:18:15.509436 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:15 crc kubenswrapper[4923]: I0224 03:18:15.950414 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:16 crc kubenswrapper[4923]: I0224 03:18:16.026947 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29ckd"] Feb 24 03:18:16 crc kubenswrapper[4923]: I0224 03:18:16.877704 4923 generic.go:334] "Generic (PLEG): container finished" podID="3816ebdb-67f1-4d77-835e-fd9323d883fd" containerID="5fd13d2cbaf2fc6bd4f0971cf54effa6124edf0d267f067e43601d97f620cb10" exitCode=0 Feb 24 03:18:16 crc kubenswrapper[4923]: I0224 03:18:16.877833 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" event={"ID":"3816ebdb-67f1-4d77-835e-fd9323d883fd","Type":"ContainerDied","Data":"5fd13d2cbaf2fc6bd4f0971cf54effa6124edf0d267f067e43601d97f620cb10"} Feb 24 03:18:17 crc kubenswrapper[4923]: I0224 03:18:17.887039 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-29ckd" podUID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerName="registry-server" 
containerID="cri-o://3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae" gracePeriod=2 Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.422580 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.433589 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.463777 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-ssh-key-openstack-edpm-ipam\") pod \"3816ebdb-67f1-4d77-835e-fd9323d883fd\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.463923 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-inventory\") pod \"3816ebdb-67f1-4d77-835e-fd9323d883fd\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.464048 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzqwc\" (UniqueName: \"kubernetes.io/projected/3816ebdb-67f1-4d77-835e-fd9323d883fd-kube-api-access-pzqwc\") pod \"3816ebdb-67f1-4d77-835e-fd9323d883fd\" (UID: \"3816ebdb-67f1-4d77-835e-fd9323d883fd\") " Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.470864 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3816ebdb-67f1-4d77-835e-fd9323d883fd-kube-api-access-pzqwc" (OuterVolumeSpecName: "kube-api-access-pzqwc") pod "3816ebdb-67f1-4d77-835e-fd9323d883fd" (UID: "3816ebdb-67f1-4d77-835e-fd9323d883fd"). 
InnerVolumeSpecName "kube-api-access-pzqwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.496381 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-inventory" (OuterVolumeSpecName: "inventory") pod "3816ebdb-67f1-4d77-835e-fd9323d883fd" (UID: "3816ebdb-67f1-4d77-835e-fd9323d883fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.519475 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3816ebdb-67f1-4d77-835e-fd9323d883fd" (UID: "3816ebdb-67f1-4d77-835e-fd9323d883fd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.565193 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-utilities\") pod \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.565976 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-catalog-content\") pod \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.566001 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-utilities" (OuterVolumeSpecName: "utilities") pod "c446e1e9-ba6a-40ed-81d4-9e6df663a230" (UID: 
"c446e1e9-ba6a-40ed-81d4-9e6df663a230"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.566103 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8r5t\" (UniqueName: \"kubernetes.io/projected/c446e1e9-ba6a-40ed-81d4-9e6df663a230-kube-api-access-k8r5t\") pod \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\" (UID: \"c446e1e9-ba6a-40ed-81d4-9e6df663a230\") " Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.566833 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.566920 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3816ebdb-67f1-4d77-835e-fd9323d883fd-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.566973 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzqwc\" (UniqueName: \"kubernetes.io/projected/3816ebdb-67f1-4d77-835e-fd9323d883fd-kube-api-access-pzqwc\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.567023 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.568990 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c446e1e9-ba6a-40ed-81d4-9e6df663a230-kube-api-access-k8r5t" (OuterVolumeSpecName: "kube-api-access-k8r5t") pod "c446e1e9-ba6a-40ed-81d4-9e6df663a230" (UID: "c446e1e9-ba6a-40ed-81d4-9e6df663a230"). InnerVolumeSpecName "kube-api-access-k8r5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.614010 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c446e1e9-ba6a-40ed-81d4-9e6df663a230" (UID: "c446e1e9-ba6a-40ed-81d4-9e6df663a230"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.668949 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8r5t\" (UniqueName: \"kubernetes.io/projected/c446e1e9-ba6a-40ed-81d4-9e6df663a230-kube-api-access-k8r5t\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.668998 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c446e1e9-ba6a-40ed-81d4-9e6df663a230-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.900202 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" event={"ID":"3816ebdb-67f1-4d77-835e-fd9323d883fd","Type":"ContainerDied","Data":"5f5c1ff6512467ce6a703bc974d4b557eeef915cd890bee69a336a5c56ec6001"} Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.900271 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f5c1ff6512467ce6a703bc974d4b557eeef915cd890bee69a336a5c56ec6001" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.900227 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-r85nc" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.904567 4923 generic.go:334] "Generic (PLEG): container finished" podID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerID="3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae" exitCode=0 Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.904617 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29ckd" event={"ID":"c446e1e9-ba6a-40ed-81d4-9e6df663a230","Type":"ContainerDied","Data":"3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae"} Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.904653 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29ckd" event={"ID":"c446e1e9-ba6a-40ed-81d4-9e6df663a230","Type":"ContainerDied","Data":"52eef34ca859370758257f0ac158cabf21974746cc50db713b10b2519a737dac"} Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.904681 4923 scope.go:117] "RemoveContainer" containerID="3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.904661 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29ckd" Feb 24 03:18:18 crc kubenswrapper[4923]: I0224 03:18:18.970954 4923 scope.go:117] "RemoveContainer" containerID="ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:18.999089 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29ckd"] Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.012974 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-29ckd"] Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.018034 4923 scope.go:117] "RemoveContainer" containerID="8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.025908 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb"] Feb 24 03:18:19 crc kubenswrapper[4923]: E0224 03:18:19.026331 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3816ebdb-67f1-4d77-835e-fd9323d883fd" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.026393 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="3816ebdb-67f1-4d77-835e-fd9323d883fd" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 24 03:18:19 crc kubenswrapper[4923]: E0224 03:18:19.026444 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerName="registry-server" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.026507 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerName="registry-server" Feb 24 03:18:19 crc kubenswrapper[4923]: E0224 03:18:19.026587 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" 
containerName="extract-content" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.026640 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerName="extract-content" Feb 24 03:18:19 crc kubenswrapper[4923]: E0224 03:18:19.026699 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerName="extract-utilities" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.026848 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerName="extract-utilities" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.027066 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" containerName="registry-server" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.027125 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="3816ebdb-67f1-4d77-835e-fd9323d883fd" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.027756 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.033498 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.033538 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.033577 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.033870 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.036617 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb"] Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.082012 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.082068 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.082184 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.082230 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkkmn\" (UniqueName: \"kubernetes.io/projected/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-kube-api-access-wkkmn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.110993 4923 scope.go:117] "RemoveContainer" containerID="3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae" Feb 24 03:18:19 crc kubenswrapper[4923]: E0224 03:18:19.111519 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae\": container with ID starting with 3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae not found: ID does not exist" containerID="3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.111553 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae"} err="failed to get container status \"3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae\": rpc error: code = NotFound desc = could not find container \"3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae\": container 
with ID starting with 3293857dbf34de7728fc586dc8311ba4cf84d842af02c19c2e71cf5af0f7c4ae not found: ID does not exist" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.111574 4923 scope.go:117] "RemoveContainer" containerID="ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547" Feb 24 03:18:19 crc kubenswrapper[4923]: E0224 03:18:19.112009 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547\": container with ID starting with ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547 not found: ID does not exist" containerID="ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.112034 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547"} err="failed to get container status \"ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547\": rpc error: code = NotFound desc = could not find container \"ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547\": container with ID starting with ecd6cc169b8ac248c85ce4dee6d885e223b16b61b18fb78c8a3ebc1145c9b547 not found: ID does not exist" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.112049 4923 scope.go:117] "RemoveContainer" containerID="8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7" Feb 24 03:18:19 crc kubenswrapper[4923]: E0224 03:18:19.112252 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7\": container with ID starting with 8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7 not found: ID does not exist" containerID="8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7" 
Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.112288 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7"} err="failed to get container status \"8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7\": rpc error: code = NotFound desc = could not find container \"8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7\": container with ID starting with 8dbe533c3aebefd4fdeac2dc9bbb8e06a8ae37f776e48e0c49f8fa38fb4803a7 not found: ID does not exist" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.184584 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.184636 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.184715 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.184753 
4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkkmn\" (UniqueName: \"kubernetes.io/projected/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-kube-api-access-wkkmn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.190547 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.190915 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.191484 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.201111 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkkmn\" (UniqueName: \"kubernetes.io/projected/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-kube-api-access-wkkmn\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.437828 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:18:19 crc kubenswrapper[4923]: I0224 03:18:19.729135 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c446e1e9-ba6a-40ed-81d4-9e6df663a230" path="/var/lib/kubelet/pods/c446e1e9-ba6a-40ed-81d4-9e6df663a230/volumes" Feb 24 03:18:20 crc kubenswrapper[4923]: I0224 03:18:20.181045 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb"] Feb 24 03:18:20 crc kubenswrapper[4923]: W0224 03:18:20.186219 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd6fe20f_e2e1_46ae_aa88_cbfd410076a2.slice/crio-56278498ea59c26475526f1a25534d264da187eb79c9f5364374db96baaeb518 WatchSource:0}: Error finding container 56278498ea59c26475526f1a25534d264da187eb79c9f5364374db96baaeb518: Status 404 returned error can't find the container with id 56278498ea59c26475526f1a25534d264da187eb79c9f5364374db96baaeb518 Feb 24 03:18:20 crc kubenswrapper[4923]: I0224 03:18:20.925603 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" event={"ID":"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2","Type":"ContainerStarted","Data":"d197334fcd5f9d118f387d06681634ccd944159e6eee24ea5e43214a71b26f10"} Feb 24 03:18:20 crc kubenswrapper[4923]: I0224 03:18:20.925642 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" 
event={"ID":"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2","Type":"ContainerStarted","Data":"56278498ea59c26475526f1a25534d264da187eb79c9f5364374db96baaeb518"} Feb 24 03:18:20 crc kubenswrapper[4923]: I0224 03:18:20.943257 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" podStartSLOduration=2.5394279920000002 podStartE2EDuration="2.943241688s" podCreationTimestamp="2026-02-24 03:18:18 +0000 UTC" firstStartedPulling="2026-02-24 03:18:20.18923286 +0000 UTC m=+1424.206303673" lastFinishedPulling="2026-02-24 03:18:20.593046546 +0000 UTC m=+1424.610117369" observedRunningTime="2026-02-24 03:18:20.938650078 +0000 UTC m=+1424.955720891" watchObservedRunningTime="2026-02-24 03:18:20.943241688 +0000 UTC m=+1424.960312501" Feb 24 03:18:41 crc kubenswrapper[4923]: I0224 03:18:41.499963 4923 scope.go:117] "RemoveContainer" containerID="0a8b05dd27413b3898cfa9da0c0a9d1b63d976cbb447edb1c4ef7b5b64ddcf15" Feb 24 03:18:41 crc kubenswrapper[4923]: I0224 03:18:41.561671 4923 scope.go:117] "RemoveContainer" containerID="ff329fbe5e8fe22510fc5fa1adc20cf30514354075659b6184cd149442139275" Feb 24 03:18:41 crc kubenswrapper[4923]: I0224 03:18:41.620188 4923 scope.go:117] "RemoveContainer" containerID="402bfedafb99a5b76acdee3ff430f6d1b0dc0dcd5ce0bef31c9f6a5f1492efe0" Feb 24 03:18:41 crc kubenswrapper[4923]: I0224 03:18:41.650742 4923 scope.go:117] "RemoveContainer" containerID="f214badca1940336d886b38fbd14ffa06f3f3192e9d3d1cef2ec79f7bef5c6b0" Feb 24 03:18:41 crc kubenswrapper[4923]: I0224 03:18:41.704835 4923 scope.go:117] "RemoveContainer" containerID="780d10fe0138b329c1369b0f1cbd1e1e5c8fbef05d9caf5651b24f4d82a4f4d2" Feb 24 03:18:41 crc kubenswrapper[4923]: I0224 03:18:41.727747 4923 scope.go:117] "RemoveContainer" containerID="fea3d74be017adec0f287e658df385cb96dac19e272b623b9689db1c0588a683" Feb 24 03:18:41 crc kubenswrapper[4923]: I0224 03:18:41.791024 4923 scope.go:117] "RemoveContainer" 
containerID="1501ce8e6dcd79ed916a4ecd4fff77e72e3339ab4d79052e4f0610f226edeca9" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.065769 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dzlb7"] Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.075183 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.087720 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzlb7"] Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.170688 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-catalog-content\") pod \"certified-operators-dzlb7\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.170805 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-utilities\") pod \"certified-operators-dzlb7\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.170976 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnmch\" (UniqueName: \"kubernetes.io/projected/b3ef13ed-22d6-4df6-9318-f4e8be703a58-kube-api-access-wnmch\") pod \"certified-operators-dzlb7\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.273130 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wnmch\" (UniqueName: \"kubernetes.io/projected/b3ef13ed-22d6-4df6-9318-f4e8be703a58-kube-api-access-wnmch\") pod \"certified-operators-dzlb7\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.273201 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-catalog-content\") pod \"certified-operators-dzlb7\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.273276 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-utilities\") pod \"certified-operators-dzlb7\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.273804 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-catalog-content\") pod \"certified-operators-dzlb7\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.273841 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-utilities\") pod \"certified-operators-dzlb7\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.299158 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnmch\" (UniqueName: 
\"kubernetes.io/projected/b3ef13ed-22d6-4df6-9318-f4e8be703a58-kube-api-access-wnmch\") pod \"certified-operators-dzlb7\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.407256 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:18:55 crc kubenswrapper[4923]: I0224 03:18:55.907054 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dzlb7"] Feb 24 03:18:56 crc kubenswrapper[4923]: I0224 03:18:56.314198 4923 generic.go:334] "Generic (PLEG): container finished" podID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerID="a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b" exitCode=0 Feb 24 03:18:56 crc kubenswrapper[4923]: I0224 03:18:56.314233 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzlb7" event={"ID":"b3ef13ed-22d6-4df6-9318-f4e8be703a58","Type":"ContainerDied","Data":"a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b"} Feb 24 03:18:56 crc kubenswrapper[4923]: I0224 03:18:56.314257 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzlb7" event={"ID":"b3ef13ed-22d6-4df6-9318-f4e8be703a58","Type":"ContainerStarted","Data":"df3688272a848c91219b9c27bb3b983dc4a87a3806da9ad750e280f1cbca8f95"} Feb 24 03:18:57 crc kubenswrapper[4923]: I0224 03:18:57.323440 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzlb7" event={"ID":"b3ef13ed-22d6-4df6-9318-f4e8be703a58","Type":"ContainerStarted","Data":"2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954"} Feb 24 03:18:58 crc kubenswrapper[4923]: I0224 03:18:58.339045 4923 generic.go:334] "Generic (PLEG): container finished" podID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" 
containerID="2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954" exitCode=0 Feb 24 03:18:58 crc kubenswrapper[4923]: I0224 03:18:58.339122 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzlb7" event={"ID":"b3ef13ed-22d6-4df6-9318-f4e8be703a58","Type":"ContainerDied","Data":"2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954"} Feb 24 03:18:59 crc kubenswrapper[4923]: I0224 03:18:59.351525 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzlb7" event={"ID":"b3ef13ed-22d6-4df6-9318-f4e8be703a58","Type":"ContainerStarted","Data":"0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2"} Feb 24 03:19:05 crc kubenswrapper[4923]: I0224 03:19:05.408314 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:19:05 crc kubenswrapper[4923]: I0224 03:19:05.408842 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:19:05 crc kubenswrapper[4923]: I0224 03:19:05.463562 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:19:05 crc kubenswrapper[4923]: I0224 03:19:05.505540 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dzlb7" podStartSLOduration=8.101795914 podStartE2EDuration="10.505508469s" podCreationTimestamp="2026-02-24 03:18:55 +0000 UTC" firstStartedPulling="2026-02-24 03:18:56.31701173 +0000 UTC m=+1460.334082563" lastFinishedPulling="2026-02-24 03:18:58.720724305 +0000 UTC m=+1462.737795118" observedRunningTime="2026-02-24 03:18:59.374287022 +0000 UTC m=+1463.391357865" watchObservedRunningTime="2026-02-24 03:19:05.505508469 +0000 UTC m=+1469.522579292" Feb 24 03:19:05 crc kubenswrapper[4923]: I0224 
03:19:05.514557 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:19:05 crc kubenswrapper[4923]: I0224 03:19:05.703229 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzlb7"] Feb 24 03:19:07 crc kubenswrapper[4923]: I0224 03:19:07.437407 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dzlb7" podUID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerName="registry-server" containerID="cri-o://0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2" gracePeriod=2 Feb 24 03:19:07 crc kubenswrapper[4923]: I0224 03:19:07.884788 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:19:07 crc kubenswrapper[4923]: I0224 03:19:07.923419 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnmch\" (UniqueName: \"kubernetes.io/projected/b3ef13ed-22d6-4df6-9318-f4e8be703a58-kube-api-access-wnmch\") pod \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " Feb 24 03:19:07 crc kubenswrapper[4923]: I0224 03:19:07.923578 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-catalog-content\") pod \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " Feb 24 03:19:07 crc kubenswrapper[4923]: I0224 03:19:07.924679 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-utilities" (OuterVolumeSpecName: "utilities") pod "b3ef13ed-22d6-4df6-9318-f4e8be703a58" (UID: "b3ef13ed-22d6-4df6-9318-f4e8be703a58"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:19:07 crc kubenswrapper[4923]: I0224 03:19:07.937154 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-utilities\") pod \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\" (UID: \"b3ef13ed-22d6-4df6-9318-f4e8be703a58\") " Feb 24 03:19:07 crc kubenswrapper[4923]: I0224 03:19:07.938135 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:19:07 crc kubenswrapper[4923]: I0224 03:19:07.957967 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ef13ed-22d6-4df6-9318-f4e8be703a58-kube-api-access-wnmch" (OuterVolumeSpecName: "kube-api-access-wnmch") pod "b3ef13ed-22d6-4df6-9318-f4e8be703a58" (UID: "b3ef13ed-22d6-4df6-9318-f4e8be703a58"). InnerVolumeSpecName "kube-api-access-wnmch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.039861 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnmch\" (UniqueName: \"kubernetes.io/projected/b3ef13ed-22d6-4df6-9318-f4e8be703a58-kube-api-access-wnmch\") on node \"crc\" DevicePath \"\"" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.449235 4923 generic.go:334] "Generic (PLEG): container finished" podID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerID="0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2" exitCode=0 Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.449319 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dzlb7" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.449341 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzlb7" event={"ID":"b3ef13ed-22d6-4df6-9318-f4e8be703a58","Type":"ContainerDied","Data":"0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2"} Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.449769 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dzlb7" event={"ID":"b3ef13ed-22d6-4df6-9318-f4e8be703a58","Type":"ContainerDied","Data":"df3688272a848c91219b9c27bb3b983dc4a87a3806da9ad750e280f1cbca8f95"} Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.449798 4923 scope.go:117] "RemoveContainer" containerID="0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.468641 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3ef13ed-22d6-4df6-9318-f4e8be703a58" (UID: "b3ef13ed-22d6-4df6-9318-f4e8be703a58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.469702 4923 scope.go:117] "RemoveContainer" containerID="2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.513319 4923 scope.go:117] "RemoveContainer" containerID="a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.536228 4923 scope.go:117] "RemoveContainer" containerID="0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2" Feb 24 03:19:08 crc kubenswrapper[4923]: E0224 03:19:08.536491 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2\": container with ID starting with 0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2 not found: ID does not exist" containerID="0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.536525 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2"} err="failed to get container status \"0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2\": rpc error: code = NotFound desc = could not find container \"0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2\": container with ID starting with 0ab299dec228a86833445e53734e2a8a9d260689b0fbd1cf20b283d3a3b9bca2 not found: ID does not exist" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.536545 4923 scope.go:117] "RemoveContainer" containerID="2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954" Feb 24 03:19:08 crc kubenswrapper[4923]: E0224 03:19:08.537011 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954\": container with ID starting with 2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954 not found: ID does not exist" containerID="2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.537042 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954"} err="failed to get container status \"2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954\": rpc error: code = NotFound desc = could not find container \"2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954\": container with ID starting with 2078852c9b743ebd645a20a58dae260136f23f3b90a3165585bbd544f6bf6954 not found: ID does not exist" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.537061 4923 scope.go:117] "RemoveContainer" containerID="a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b" Feb 24 03:19:08 crc kubenswrapper[4923]: E0224 03:19:08.537418 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b\": container with ID starting with a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b not found: ID does not exist" containerID="a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.537451 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b"} err="failed to get container status \"a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b\": rpc error: code = NotFound desc = could not find container \"a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b\": 
container with ID starting with a6fbed4d74ed19b1b0556b45cb5702c5108db723a17f7746b27312c18b7a6b1b not found: ID does not exist" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.549495 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ef13ed-22d6-4df6-9318-f4e8be703a58-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.785690 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dzlb7"] Feb 24 03:19:08 crc kubenswrapper[4923]: I0224 03:19:08.793367 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dzlb7"] Feb 24 03:19:09 crc kubenswrapper[4923]: I0224 03:19:09.726775 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" path="/var/lib/kubelet/pods/b3ef13ed-22d6-4df6-9318-f4e8be703a58/volumes" Feb 24 03:19:41 crc kubenswrapper[4923]: I0224 03:19:41.938270 4923 scope.go:117] "RemoveContainer" containerID="5a85c6672f1a475a0cbaf700fa58bad085c139f5bb8fd40c166325f0e40fa6b0" Feb 24 03:19:49 crc kubenswrapper[4923]: I0224 03:19:49.916273 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:19:49 crc kubenswrapper[4923]: I0224 03:19:49.917202 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:20:19 crc kubenswrapper[4923]: I0224 03:20:19.916941 4923 
patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:20:19 crc kubenswrapper[4923]: I0224 03:20:19.917531 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:20:42 crc kubenswrapper[4923]: I0224 03:20:42.023061 4923 scope.go:117] "RemoveContainer" containerID="9aa9a41987a9cbc2a537716b3c3c740914d8cdb2d920f66f66248b16d5c304cc" Feb 24 03:20:42 crc kubenswrapper[4923]: I0224 03:20:42.058855 4923 scope.go:117] "RemoveContainer" containerID="7a139f047a5ae84a7fc57e83cc3565a1ca4c169a2426e55da5e73a4d29cd4dc1" Feb 24 03:20:42 crc kubenswrapper[4923]: I0224 03:20:42.080370 4923 scope.go:117] "RemoveContainer" containerID="7ed4bef870cd5591742472697f31998e6d865721af6dd8c097c21e7145f19158" Feb 24 03:20:42 crc kubenswrapper[4923]: I0224 03:20:42.104981 4923 scope.go:117] "RemoveContainer" containerID="4f86b155c5991fc15d0653acee0a90987cb67855c3ee8d6fe214884b19dd7d58" Feb 24 03:20:49 crc kubenswrapper[4923]: I0224 03:20:49.916879 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:20:49 crc kubenswrapper[4923]: I0224 03:20:49.917731 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:20:49 crc kubenswrapper[4923]: I0224 03:20:49.917796 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 03:20:49 crc kubenswrapper[4923]: I0224 03:20:49.918881 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0faeb363e0b14f83a047ef04f8fa2df18f1991b14418890ba609de06ecd5c251"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 03:20:49 crc kubenswrapper[4923]: I0224 03:20:49.918983 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://0faeb363e0b14f83a047ef04f8fa2df18f1991b14418890ba609de06ecd5c251" gracePeriod=600 Feb 24 03:20:50 crc kubenswrapper[4923]: I0224 03:20:50.743846 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="0faeb363e0b14f83a047ef04f8fa2df18f1991b14418890ba609de06ecd5c251" exitCode=0 Feb 24 03:20:50 crc kubenswrapper[4923]: I0224 03:20:50.743904 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"0faeb363e0b14f83a047ef04f8fa2df18f1991b14418890ba609de06ecd5c251"} Feb 24 03:20:50 crc kubenswrapper[4923]: I0224 03:20:50.744819 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" 
event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6"} Feb 24 03:20:50 crc kubenswrapper[4923]: I0224 03:20:50.744874 4923 scope.go:117] "RemoveContainer" containerID="84006aadd17b2e131a632622b49eac940374eaac532afbb7829f93e09553d367" Feb 24 03:21:18 crc kubenswrapper[4923]: I0224 03:21:18.041879 4923 generic.go:334] "Generic (PLEG): container finished" podID="dd6fe20f-e2e1-46ae-aa88-cbfd410076a2" containerID="d197334fcd5f9d118f387d06681634ccd944159e6eee24ea5e43214a71b26f10" exitCode=0 Feb 24 03:21:18 crc kubenswrapper[4923]: I0224 03:21:18.042010 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" event={"ID":"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2","Type":"ContainerDied","Data":"d197334fcd5f9d118f387d06681634ccd944159e6eee24ea5e43214a71b26f10"} Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.471636 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.596470 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-ssh-key-openstack-edpm-ipam\") pod \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.596618 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-inventory\") pod \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.596895 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkkmn\" (UniqueName: \"kubernetes.io/projected/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-kube-api-access-wkkmn\") pod \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.597001 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-bootstrap-combined-ca-bundle\") pod \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\" (UID: \"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2\") " Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.607644 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "dd6fe20f-e2e1-46ae-aa88-cbfd410076a2" (UID: "dd6fe20f-e2e1-46ae-aa88-cbfd410076a2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.614802 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-kube-api-access-wkkmn" (OuterVolumeSpecName: "kube-api-access-wkkmn") pod "dd6fe20f-e2e1-46ae-aa88-cbfd410076a2" (UID: "dd6fe20f-e2e1-46ae-aa88-cbfd410076a2"). InnerVolumeSpecName "kube-api-access-wkkmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.646473 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dd6fe20f-e2e1-46ae-aa88-cbfd410076a2" (UID: "dd6fe20f-e2e1-46ae-aa88-cbfd410076a2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.650595 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-inventory" (OuterVolumeSpecName: "inventory") pod "dd6fe20f-e2e1-46ae-aa88-cbfd410076a2" (UID: "dd6fe20f-e2e1-46ae-aa88-cbfd410076a2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.700438 4923 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.700491 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.700509 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:21:19 crc kubenswrapper[4923]: I0224 03:21:19.700525 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkkmn\" (UniqueName: \"kubernetes.io/projected/dd6fe20f-e2e1-46ae-aa88-cbfd410076a2-kube-api-access-wkkmn\") on node \"crc\" DevicePath \"\"" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.063954 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" event={"ID":"dd6fe20f-e2e1-46ae-aa88-cbfd410076a2","Type":"ContainerDied","Data":"56278498ea59c26475526f1a25534d264da187eb79c9f5364374db96baaeb518"} Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.064064 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56278498ea59c26475526f1a25534d264da187eb79c9f5364374db96baaeb518" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.064006 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.182492 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw"] Feb 24 03:21:20 crc kubenswrapper[4923]: E0224 03:21:20.183050 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerName="registry-server" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.183075 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerName="registry-server" Feb 24 03:21:20 crc kubenswrapper[4923]: E0224 03:21:20.183132 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerName="extract-utilities" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.183141 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerName="extract-utilities" Feb 24 03:21:20 crc kubenswrapper[4923]: E0224 03:21:20.183162 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6fe20f-e2e1-46ae-aa88-cbfd410076a2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.183172 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6fe20f-e2e1-46ae-aa88-cbfd410076a2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 24 03:21:20 crc kubenswrapper[4923]: E0224 03:21:20.183186 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerName="extract-content" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.183194 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerName="extract-content" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.183408 
4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6fe20f-e2e1-46ae-aa88-cbfd410076a2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.183451 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ef13ed-22d6-4df6-9318-f4e8be703a58" containerName="registry-server" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.184267 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.187225 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.187454 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.187711 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.191198 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.220442 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw"] Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.314244 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 
03:21:20.314758 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxcpq\" (UniqueName: \"kubernetes.io/projected/04b83327-1210-4a1a-b104-70fff61786bf-kube-api-access-pxcpq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.314820 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.416422 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxcpq\" (UniqueName: \"kubernetes.io/projected/04b83327-1210-4a1a-b104-70fff61786bf-kube-api-access-pxcpq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.416491 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.416578 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.427061 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.427892 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.442925 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxcpq\" (UniqueName: \"kubernetes.io/projected/04b83327-1210-4a1a-b104-70fff61786bf-kube-api-access-pxcpq\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:20 crc kubenswrapper[4923]: I0224 03:21:20.520878 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:21:21 crc kubenswrapper[4923]: I0224 03:21:21.072095 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw"] Feb 24 03:21:22 crc kubenswrapper[4923]: I0224 03:21:22.082498 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" event={"ID":"04b83327-1210-4a1a-b104-70fff61786bf","Type":"ContainerStarted","Data":"c4486f693033b0fae8971d6153c5b1efd2f3de1365fd8c8544d38fb5eae904d1"} Feb 24 03:21:22 crc kubenswrapper[4923]: I0224 03:21:22.083020 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" event={"ID":"04b83327-1210-4a1a-b104-70fff61786bf","Type":"ContainerStarted","Data":"cf1655446c84407822384258333a21c78afa9b57ab89275e002340c942a17350"} Feb 24 03:21:22 crc kubenswrapper[4923]: I0224 03:21:22.098152 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" podStartSLOduration=1.597326452 podStartE2EDuration="2.098127749s" podCreationTimestamp="2026-02-24 03:21:20 +0000 UTC" firstStartedPulling="2026-02-24 03:21:21.069544205 +0000 UTC m=+1605.086615018" lastFinishedPulling="2026-02-24 03:21:21.570345492 +0000 UTC m=+1605.587416315" observedRunningTime="2026-02-24 03:21:22.09623597 +0000 UTC m=+1606.113306793" watchObservedRunningTime="2026-02-24 03:21:22.098127749 +0000 UTC m=+1606.115198582" Feb 24 03:21:42 crc kubenswrapper[4923]: I0224 03:21:42.168512 4923 scope.go:117] "RemoveContainer" containerID="d3ae1f24f0ca3e058c7eb043ee4ad5b971168ed8934c071880078616ee1f8084" Feb 24 03:22:17 crc kubenswrapper[4923]: I0224 03:22:17.061988 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-79bc-account-create-update-zfc9d"] Feb 24 03:22:17 
crc kubenswrapper[4923]: I0224 03:22:17.079038 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-79bc-account-create-update-zfc9d"] Feb 24 03:22:17 crc kubenswrapper[4923]: I0224 03:22:17.090173 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-krv9d"] Feb 24 03:22:17 crc kubenswrapper[4923]: I0224 03:22:17.098800 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-krv9d"] Feb 24 03:22:17 crc kubenswrapper[4923]: I0224 03:22:17.733203 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627f800c-b94f-43f0-b2be-f7da4d5cb178" path="/var/lib/kubelet/pods/627f800c-b94f-43f0-b2be-f7da4d5cb178/volumes" Feb 24 03:22:17 crc kubenswrapper[4923]: I0224 03:22:17.733979 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abba0568-43ed-4b23-9437-8da7ed288e99" path="/var/lib/kubelet/pods/abba0568-43ed-4b23-9437-8da7ed288e99/volumes" Feb 24 03:22:18 crc kubenswrapper[4923]: I0224 03:22:18.046377 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2t5qn"] Feb 24 03:22:18 crc kubenswrapper[4923]: I0224 03:22:18.063693 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-351c-account-create-update-knt7d"] Feb 24 03:22:18 crc kubenswrapper[4923]: I0224 03:22:18.073229 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5bfc-account-create-update-z6s8t"] Feb 24 03:22:18 crc kubenswrapper[4923]: I0224 03:22:18.080107 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xlmjx"] Feb 24 03:22:18 crc kubenswrapper[4923]: I0224 03:22:18.086721 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-351c-account-create-update-knt7d"] Feb 24 03:22:18 crc kubenswrapper[4923]: I0224 03:22:18.093357 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2t5qn"] Feb 24 
03:22:18 crc kubenswrapper[4923]: I0224 03:22:18.101109 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xlmjx"] Feb 24 03:22:18 crc kubenswrapper[4923]: I0224 03:22:18.108751 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5bfc-account-create-update-z6s8t"] Feb 24 03:22:19 crc kubenswrapper[4923]: I0224 03:22:19.732095 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a78b2aa-1ea3-4b48-bc85-128afc2bfa06" path="/var/lib/kubelet/pods/3a78b2aa-1ea3-4b48-bc85-128afc2bfa06/volumes" Feb 24 03:22:19 crc kubenswrapper[4923]: I0224 03:22:19.733719 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d57faa09-2679-4d8f-92f5-45a2dccda444" path="/var/lib/kubelet/pods/d57faa09-2679-4d8f-92f5-45a2dccda444/volumes" Feb 24 03:22:19 crc kubenswrapper[4923]: I0224 03:22:19.735582 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ab4036-b523-47e5-ac77-f346b3f4e60f" path="/var/lib/kubelet/pods/d9ab4036-b523-47e5-ac77-f346b3f4e60f/volumes" Feb 24 03:22:19 crc kubenswrapper[4923]: I0224 03:22:19.737482 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f714270c-0420-4d1f-92fc-24afd3587dfc" path="/var/lib/kubelet/pods/f714270c-0420-4d1f-92fc-24afd3587dfc/volumes" Feb 24 03:22:24 crc kubenswrapper[4923]: I0224 03:22:24.046270 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bs9t5"] Feb 24 03:22:24 crc kubenswrapper[4923]: I0224 03:22:24.066484 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bs9t5"] Feb 24 03:22:25 crc kubenswrapper[4923]: I0224 03:22:25.730436 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16745ab2-3d14-4ee0-9385-f3f8913b865e" path="/var/lib/kubelet/pods/16745ab2-3d14-4ee0-9385-f3f8913b865e/volumes" Feb 24 03:22:42 crc kubenswrapper[4923]: I0224 03:22:42.044728 4923 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-s6tmr"] Feb 24 03:22:42 crc kubenswrapper[4923]: I0224 03:22:42.056592 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-s6tmr"] Feb 24 03:22:42 crc kubenswrapper[4923]: I0224 03:22:42.224579 4923 scope.go:117] "RemoveContainer" containerID="1efc4bb526339629c885d0b306bf264bcaeb482c4d4dbf127c9aa3929a278019" Feb 24 03:22:42 crc kubenswrapper[4923]: I0224 03:22:42.251445 4923 scope.go:117] "RemoveContainer" containerID="2ba02616a977174b628241c112670c43c58695cf30f3594f8357acff57882917" Feb 24 03:22:42 crc kubenswrapper[4923]: I0224 03:22:42.308150 4923 scope.go:117] "RemoveContainer" containerID="5e987cbbecc17dbadf8c01f3cdf877275c9210e566fb1e10242836ef9c2f4ad6" Feb 24 03:22:42 crc kubenswrapper[4923]: I0224 03:22:42.345861 4923 scope.go:117] "RemoveContainer" containerID="11c7a6d9684f5e25454deb86433fd0358abeb688f4a7bb99c8408450a620758a" Feb 24 03:22:42 crc kubenswrapper[4923]: I0224 03:22:42.396776 4923 scope.go:117] "RemoveContainer" containerID="d48a3ceb5f9b48d55905d1d1f01a9fd1ffedd09b0a6ab3af9813f7695a32207b" Feb 24 03:22:42 crc kubenswrapper[4923]: I0224 03:22:42.438162 4923 scope.go:117] "RemoveContainer" containerID="7eeaf19ae8f81b8e45e7b0e92abe21d4fe177989adcff7aa93b3d31836471735" Feb 24 03:22:42 crc kubenswrapper[4923]: I0224 03:22:42.478890 4923 scope.go:117] "RemoveContainer" containerID="82c4f785863b9a50a42bd9130a45b9860f893cb71fd5de77ec4a3ed236614634" Feb 24 03:22:43 crc kubenswrapper[4923]: I0224 03:22:43.730226 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0" path="/var/lib/kubelet/pods/72dcb6a8-2ea4-44ac-b9e3-0f1d7751bec0/volumes" Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.072576 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gpxn4"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.085362 4923 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/neutron-db-create-fjcwf"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.100674 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cd2f-account-create-update-l8bc9"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.107991 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d178-account-create-update-pck7k"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.115183 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1a60-account-create-update-9zjxd"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.123314 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gpxn4"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.130693 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-lpjk7"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.138825 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1a60-account-create-update-9zjxd"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.145992 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fjcwf"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.153738 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cd2f-account-create-update-l8bc9"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.161003 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d178-account-create-update-pck7k"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.169173 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-lpjk7"] Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.730883 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02b56c32-f855-4c43-a006-c546a59e977f" 
path="/var/lib/kubelet/pods/02b56c32-f855-4c43-a006-c546a59e977f/volumes" Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.732878 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2194e053-012e-4478-aa2c-70dceb03dc7a" path="/var/lib/kubelet/pods/2194e053-012e-4478-aa2c-70dceb03dc7a/volumes" Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.734134 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33de35a6-2c83-4492-8ca5-103c030007ea" path="/var/lib/kubelet/pods/33de35a6-2c83-4492-8ca5-103c030007ea/volumes" Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.735004 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813d78f2-56a0-4658-b5fd-ab17e99db899" path="/var/lib/kubelet/pods/813d78f2-56a0-4658-b5fd-ab17e99db899/volumes" Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.735768 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7328613-da6b-4e74-9857-d0d8a799c505" path="/var/lib/kubelet/pods/b7328613-da6b-4e74-9857-d0d8a799c505/volumes" Feb 24 03:22:45 crc kubenswrapper[4923]: I0224 03:22:45.736533 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb8e452-36b8-4359-b5ed-34499c3b4fa4" path="/var/lib/kubelet/pods/bfb8e452-36b8-4359-b5ed-34499c3b4fa4/volumes" Feb 24 03:22:52 crc kubenswrapper[4923]: I0224 03:22:52.985816 4923 generic.go:334] "Generic (PLEG): container finished" podID="04b83327-1210-4a1a-b104-70fff61786bf" containerID="c4486f693033b0fae8971d6153c5b1efd2f3de1365fd8c8544d38fb5eae904d1" exitCode=0 Feb 24 03:22:52 crc kubenswrapper[4923]: I0224 03:22:52.985932 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" event={"ID":"04b83327-1210-4a1a-b104-70fff61786bf","Type":"ContainerDied","Data":"c4486f693033b0fae8971d6153c5b1efd2f3de1365fd8c8544d38fb5eae904d1"} Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.058313 4923 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qq8ck"] Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.074146 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qq8ck"] Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.516465 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.715404 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxcpq\" (UniqueName: \"kubernetes.io/projected/04b83327-1210-4a1a-b104-70fff61786bf-kube-api-access-pxcpq\") pod \"04b83327-1210-4a1a-b104-70fff61786bf\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.715489 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-inventory\") pod \"04b83327-1210-4a1a-b104-70fff61786bf\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.715839 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-ssh-key-openstack-edpm-ipam\") pod \"04b83327-1210-4a1a-b104-70fff61786bf\" (UID: \"04b83327-1210-4a1a-b104-70fff61786bf\") " Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.723925 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b83327-1210-4a1a-b104-70fff61786bf-kube-api-access-pxcpq" (OuterVolumeSpecName: "kube-api-access-pxcpq") pod "04b83327-1210-4a1a-b104-70fff61786bf" (UID: "04b83327-1210-4a1a-b104-70fff61786bf"). InnerVolumeSpecName "kube-api-access-pxcpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.753231 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-inventory" (OuterVolumeSpecName: "inventory") pod "04b83327-1210-4a1a-b104-70fff61786bf" (UID: "04b83327-1210-4a1a-b104-70fff61786bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.762252 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04b83327-1210-4a1a-b104-70fff61786bf" (UID: "04b83327-1210-4a1a-b104-70fff61786bf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.818353 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxcpq\" (UniqueName: \"kubernetes.io/projected/04b83327-1210-4a1a-b104-70fff61786bf-kube-api-access-pxcpq\") on node \"crc\" DevicePath \"\"" Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.818393 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:22:54 crc kubenswrapper[4923]: I0224 03:22:54.818405 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b83327-1210-4a1a-b104-70fff61786bf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.016068 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" 
event={"ID":"04b83327-1210-4a1a-b104-70fff61786bf","Type":"ContainerDied","Data":"cf1655446c84407822384258333a21c78afa9b57ab89275e002340c942a17350"} Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.016122 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf1655446c84407822384258333a21c78afa9b57ab89275e002340c942a17350" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.016186 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.102579 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2"] Feb 24 03:22:55 crc kubenswrapper[4923]: E0224 03:22:55.103112 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b83327-1210-4a1a-b104-70fff61786bf" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.103129 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b83327-1210-4a1a-b104-70fff61786bf" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.103390 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b83327-1210-4a1a-b104-70fff61786bf" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.104148 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.109065 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.109752 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.109971 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2"] Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.110107 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.110309 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.131959 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.132109 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjvst\" (UniqueName: \"kubernetes.io/projected/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-kube-api-access-wjvst\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 
03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.132330 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.234264 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.234357 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjvst\" (UniqueName: \"kubernetes.io/projected/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-kube-api-access-wjvst\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.234459 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.242100 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.243178 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.252516 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjvst\" (UniqueName: \"kubernetes.io/projected/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-kube-api-access-wjvst\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.437410 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.739170 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b6f21a-18fb-4c73-8286-32f1def45bac" path="/var/lib/kubelet/pods/49b6f21a-18fb-4c73-8286-32f1def45bac/volumes" Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.769233 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2"] Feb 24 03:22:55 crc kubenswrapper[4923]: W0224 03:22:55.771401 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dbbe8ec_f9b0_4dfe_a1ae_63ff9e7f1355.slice/crio-c02b35e93098e8dbdb77d90b8da1796298cf4863e713653252584f77404e3d5b WatchSource:0}: Error finding container c02b35e93098e8dbdb77d90b8da1796298cf4863e713653252584f77404e3d5b: Status 404 returned error can't find the container with id c02b35e93098e8dbdb77d90b8da1796298cf4863e713653252584f77404e3d5b Feb 24 03:22:55 crc kubenswrapper[4923]: I0224 03:22:55.773942 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 03:22:56 crc kubenswrapper[4923]: I0224 03:22:56.030006 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" event={"ID":"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355","Type":"ContainerStarted","Data":"c02b35e93098e8dbdb77d90b8da1796298cf4863e713653252584f77404e3d5b"} Feb 24 03:22:57 crc kubenswrapper[4923]: I0224 03:22:57.042460 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" event={"ID":"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355","Type":"ContainerStarted","Data":"8a998f678e700c1d5b99b54b869b40e592b140d1030cc5bbbfe847e446fe3966"} Feb 24 03:22:57 crc kubenswrapper[4923]: I0224 
03:22:57.064112 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" podStartSLOduration=1.397703647 podStartE2EDuration="2.064090761s" podCreationTimestamp="2026-02-24 03:22:55 +0000 UTC" firstStartedPulling="2026-02-24 03:22:55.773700911 +0000 UTC m=+1699.790771734" lastFinishedPulling="2026-02-24 03:22:56.440088035 +0000 UTC m=+1700.457158848" observedRunningTime="2026-02-24 03:22:57.060922508 +0000 UTC m=+1701.077993341" watchObservedRunningTime="2026-02-24 03:22:57.064090761 +0000 UTC m=+1701.081161574" Feb 24 03:23:19 crc kubenswrapper[4923]: I0224 03:23:19.916588 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:23:19 crc kubenswrapper[4923]: I0224 03:23:19.917084 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:23:23 crc kubenswrapper[4923]: I0224 03:23:23.064737 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-brpzr"] Feb 24 03:23:23 crc kubenswrapper[4923]: I0224 03:23:23.072042 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-brpzr"] Feb 24 03:23:23 crc kubenswrapper[4923]: I0224 03:23:23.726466 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4673b1fa-d73c-48c9-b2fd-d0f7afe97efd" path="/var/lib/kubelet/pods/4673b1fa-d73c-48c9-b2fd-d0f7afe97efd/volumes" Feb 24 03:23:29 crc kubenswrapper[4923]: I0224 03:23:29.045546 4923 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ds4gb"] Feb 24 03:23:29 crc kubenswrapper[4923]: I0224 03:23:29.055885 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ds4gb"] Feb 24 03:23:29 crc kubenswrapper[4923]: I0224 03:23:29.731899 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d1f021-7b1a-491b-9dd5-90d6425bcde7" path="/var/lib/kubelet/pods/d0d1f021-7b1a-491b-9dd5-90d6425bcde7/volumes" Feb 24 03:23:37 crc kubenswrapper[4923]: I0224 03:23:37.041871 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bqtmp"] Feb 24 03:23:37 crc kubenswrapper[4923]: I0224 03:23:37.059159 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bqtmp"] Feb 24 03:23:37 crc kubenswrapper[4923]: I0224 03:23:37.733603 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4b1a04-0555-4f02-b5bc-a8f90141e8dc" path="/var/lib/kubelet/pods/cf4b1a04-0555-4f02-b5bc-a8f90141e8dc/volumes" Feb 24 03:23:42 crc kubenswrapper[4923]: I0224 03:23:42.607484 4923 scope.go:117] "RemoveContainer" containerID="0614963ad2e1c963986cba6667764f2345d8c4c2d13c882a196321835b8be31e" Feb 24 03:23:42 crc kubenswrapper[4923]: I0224 03:23:42.635899 4923 scope.go:117] "RemoveContainer" containerID="9a57243cfbd13d04565d801702db8cce47d931c3215f3134c3a0f4cb9e99d173" Feb 24 03:23:42 crc kubenswrapper[4923]: I0224 03:23:42.705428 4923 scope.go:117] "RemoveContainer" containerID="03e8bda9ede338d2a1a471ee37b9b128d3ab6960e598c9ffbd75a7b0af61f2fd" Feb 24 03:23:42 crc kubenswrapper[4923]: I0224 03:23:42.796805 4923 scope.go:117] "RemoveContainer" containerID="7af080accce9475d950c8fde299faefe8047b53f7cb4ff3d2122f0717760e5cf" Feb 24 03:23:42 crc kubenswrapper[4923]: I0224 03:23:42.853781 4923 scope.go:117] "RemoveContainer" containerID="5a56240bfb14e5676d37cfad63c22505a75168bd7630cd8fbd3d0dc237a79ea6" Feb 24 03:23:42 crc 
kubenswrapper[4923]: I0224 03:23:42.892749 4923 scope.go:117] "RemoveContainer" containerID="5c2500c5dc77a1effb5c1da39cebeecfa432b765951689e57690b2fa16bc1a6f" Feb 24 03:23:42 crc kubenswrapper[4923]: I0224 03:23:42.950694 4923 scope.go:117] "RemoveContainer" containerID="f0411f4396fdafa783cbeff142f86afa3c7fa450917b01300c5520b5fae25405" Feb 24 03:23:42 crc kubenswrapper[4923]: I0224 03:23:42.982107 4923 scope.go:117] "RemoveContainer" containerID="d04673a161d50866cd3470d985fb7ff18c49c4cd31b1193ba6059fb111199c83" Feb 24 03:23:43 crc kubenswrapper[4923]: I0224 03:23:43.002462 4923 scope.go:117] "RemoveContainer" containerID="02795f14259e3be3db9608da5152c23a7606f40060e396cfffc3b567c9ffc3ec" Feb 24 03:23:43 crc kubenswrapper[4923]: I0224 03:23:43.038472 4923 scope.go:117] "RemoveContainer" containerID="b2296e9fb1a8750cad9c0817f5ef39ba4dbe840ed0490c45e1d95fa96444b076" Feb 24 03:23:43 crc kubenswrapper[4923]: I0224 03:23:43.058050 4923 scope.go:117] "RemoveContainer" containerID="50a88ed7328adc04b18a6a4b686a6385c3e4a6466c70a598270a5448168c54dd" Feb 24 03:23:44 crc kubenswrapper[4923]: I0224 03:23:44.041796 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rvlxb"] Feb 24 03:23:44 crc kubenswrapper[4923]: I0224 03:23:44.048675 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rvlxb"] Feb 24 03:23:45 crc kubenswrapper[4923]: I0224 03:23:45.736601 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f64c48-5ed4-431c-8636-702a8abf02b5" path="/var/lib/kubelet/pods/29f64c48-5ed4-431c-8636-702a8abf02b5/volumes" Feb 24 03:23:47 crc kubenswrapper[4923]: I0224 03:23:47.037557 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-gdl2d"] Feb 24 03:23:47 crc kubenswrapper[4923]: I0224 03:23:47.047331 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gdl2d"] Feb 24 03:23:47 crc kubenswrapper[4923]: I0224 03:23:47.729118 4923 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad7cefc-c3bb-48ff-ab05-0fe707823e84" path="/var/lib/kubelet/pods/8ad7cefc-c3bb-48ff-ab05-0fe707823e84/volumes" Feb 24 03:23:49 crc kubenswrapper[4923]: I0224 03:23:49.916240 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:23:49 crc kubenswrapper[4923]: I0224 03:23:49.916606 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:24:05 crc kubenswrapper[4923]: I0224 03:24:05.733813 4923 generic.go:334] "Generic (PLEG): container finished" podID="8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355" containerID="8a998f678e700c1d5b99b54b869b40e592b140d1030cc5bbbfe847e446fe3966" exitCode=0 Feb 24 03:24:05 crc kubenswrapper[4923]: I0224 03:24:05.735061 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" event={"ID":"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355","Type":"ContainerDied","Data":"8a998f678e700c1d5b99b54b869b40e592b140d1030cc5bbbfe847e446fe3966"} Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.172805 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.296167 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-ssh-key-openstack-edpm-ipam\") pod \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.296322 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjvst\" (UniqueName: \"kubernetes.io/projected/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-kube-api-access-wjvst\") pod \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.296529 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-inventory\") pod \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\" (UID: \"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355\") " Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.301689 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-kube-api-access-wjvst" (OuterVolumeSpecName: "kube-api-access-wjvst") pod "8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355" (UID: "8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355"). InnerVolumeSpecName "kube-api-access-wjvst". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.331383 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355" (UID: "8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.331857 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-inventory" (OuterVolumeSpecName: "inventory") pod "8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355" (UID: "8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.399672 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-inventory\") on node \"crc\" DevicePath \"\""
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.399746 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.399764 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjvst\" (UniqueName: \"kubernetes.io/projected/8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355-kube-api-access-wjvst\") on node \"crc\" DevicePath \"\""
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.754959 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2" event={"ID":"8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355","Type":"ContainerDied","Data":"c02b35e93098e8dbdb77d90b8da1796298cf4863e713653252584f77404e3d5b"}
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.755007 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c02b35e93098e8dbdb77d90b8da1796298cf4863e713653252584f77404e3d5b"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.755080 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.856173 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"]
Feb 24 03:24:07 crc kubenswrapper[4923]: E0224 03:24:07.857087 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.857199 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.857818 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.858671 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.860607 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.861279 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.861594 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.861715 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.867211 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"]
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.920955 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.921022 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cnc8\" (UniqueName: \"kubernetes.io/projected/890d3a1a-7dcc-4033-95c0-a3507815e8ff-kube-api-access-8cnc8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:07 crc kubenswrapper[4923]: I0224 03:24:07.921233 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:08 crc kubenswrapper[4923]: I0224 03:24:08.023286 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:08 crc kubenswrapper[4923]: I0224 03:24:08.023430 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:08 crc kubenswrapper[4923]: I0224 03:24:08.023467 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cnc8\" (UniqueName: \"kubernetes.io/projected/890d3a1a-7dcc-4033-95c0-a3507815e8ff-kube-api-access-8cnc8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:08 crc kubenswrapper[4923]: I0224 03:24:08.026871 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:08 crc kubenswrapper[4923]: I0224 03:24:08.028114 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:08 crc kubenswrapper[4923]: I0224 03:24:08.040233 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cnc8\" (UniqueName: \"kubernetes.io/projected/890d3a1a-7dcc-4033-95c0-a3507815e8ff-kube-api-access-8cnc8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:08 crc kubenswrapper[4923]: I0224 03:24:08.235387 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:08 crc kubenswrapper[4923]: I0224 03:24:08.790112 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"]
Feb 24 03:24:09 crc kubenswrapper[4923]: I0224 03:24:09.773953 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8" event={"ID":"890d3a1a-7dcc-4033-95c0-a3507815e8ff","Type":"ContainerStarted","Data":"cb3c8ee55f6bd588f1af1de82e545a04d95bfd712d46117eb4f8cc55dd08115b"}
Feb 24 03:24:09 crc kubenswrapper[4923]: I0224 03:24:09.775411 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8" event={"ID":"890d3a1a-7dcc-4033-95c0-a3507815e8ff","Type":"ContainerStarted","Data":"277335a07fced26778ed4e58a633ea23054f775175e90f424b979b5467685020"}
Feb 24 03:24:09 crc kubenswrapper[4923]: I0224 03:24:09.788876 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8" podStartSLOduration=2.27857412 podStartE2EDuration="2.788850058s" podCreationTimestamp="2026-02-24 03:24:07 +0000 UTC" firstStartedPulling="2026-02-24 03:24:08.795165212 +0000 UTC m=+1772.812236035" lastFinishedPulling="2026-02-24 03:24:09.30544116 +0000 UTC m=+1773.322511973" observedRunningTime="2026-02-24 03:24:09.788096628 +0000 UTC m=+1773.805167441" watchObservedRunningTime="2026-02-24 03:24:09.788850058 +0000 UTC m=+1773.805920871"
Feb 24 03:24:14 crc kubenswrapper[4923]: I0224 03:24:14.822769 4923 generic.go:334] "Generic (PLEG): container finished" podID="890d3a1a-7dcc-4033-95c0-a3507815e8ff" containerID="cb3c8ee55f6bd588f1af1de82e545a04d95bfd712d46117eb4f8cc55dd08115b" exitCode=0
Feb 24 03:24:14 crc kubenswrapper[4923]: I0224 03:24:14.822869 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8" event={"ID":"890d3a1a-7dcc-4033-95c0-a3507815e8ff","Type":"ContainerDied","Data":"cb3c8ee55f6bd588f1af1de82e545a04d95bfd712d46117eb4f8cc55dd08115b"}
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.239936 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.276889 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cnc8\" (UniqueName: \"kubernetes.io/projected/890d3a1a-7dcc-4033-95c0-a3507815e8ff-kube-api-access-8cnc8\") pod \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") "
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.277022 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-ssh-key-openstack-edpm-ipam\") pod \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") "
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.277154 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-inventory\") pod \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\" (UID: \"890d3a1a-7dcc-4033-95c0-a3507815e8ff\") "
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.283625 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890d3a1a-7dcc-4033-95c0-a3507815e8ff-kube-api-access-8cnc8" (OuterVolumeSpecName: "kube-api-access-8cnc8") pod "890d3a1a-7dcc-4033-95c0-a3507815e8ff" (UID: "890d3a1a-7dcc-4033-95c0-a3507815e8ff"). InnerVolumeSpecName "kube-api-access-8cnc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.304166 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "890d3a1a-7dcc-4033-95c0-a3507815e8ff" (UID: "890d3a1a-7dcc-4033-95c0-a3507815e8ff"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.304695 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-inventory" (OuterVolumeSpecName: "inventory") pod "890d3a1a-7dcc-4033-95c0-a3507815e8ff" (UID: "890d3a1a-7dcc-4033-95c0-a3507815e8ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.379191 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cnc8\" (UniqueName: \"kubernetes.io/projected/890d3a1a-7dcc-4033-95c0-a3507815e8ff-kube-api-access-8cnc8\") on node \"crc\" DevicePath \"\""
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.379221 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.379231 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/890d3a1a-7dcc-4033-95c0-a3507815e8ff-inventory\") on node \"crc\" DevicePath \"\""
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.847852 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8" event={"ID":"890d3a1a-7dcc-4033-95c0-a3507815e8ff","Type":"ContainerDied","Data":"277335a07fced26778ed4e58a633ea23054f775175e90f424b979b5467685020"}
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.847922 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.847947 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="277335a07fced26778ed4e58a633ea23054f775175e90f424b979b5467685020"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.951419 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"]
Feb 24 03:24:16 crc kubenswrapper[4923]: E0224 03:24:16.951912 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890d3a1a-7dcc-4033-95c0-a3507815e8ff" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.951937 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="890d3a1a-7dcc-4033-95c0-a3507815e8ff" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.952192 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="890d3a1a-7dcc-4033-95c0-a3507815e8ff" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.952939 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.955691 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.955752 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.958406 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.958689 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.961577 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"]
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.990272 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sg2sp\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.990349 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d76z9\" (UniqueName: \"kubernetes.io/projected/140f3efa-43c3-4d0b-a738-fc87e216c13b-kube-api-access-d76z9\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sg2sp\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:16 crc kubenswrapper[4923]: I0224 03:24:16.990651 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sg2sp\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:17 crc kubenswrapper[4923]: I0224 03:24:17.092759 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sg2sp\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:17 crc kubenswrapper[4923]: I0224 03:24:17.092810 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d76z9\" (UniqueName: \"kubernetes.io/projected/140f3efa-43c3-4d0b-a738-fc87e216c13b-kube-api-access-d76z9\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sg2sp\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:17 crc kubenswrapper[4923]: I0224 03:24:17.092903 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sg2sp\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:17 crc kubenswrapper[4923]: I0224 03:24:17.097035 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sg2sp\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:17 crc kubenswrapper[4923]: I0224 03:24:17.097040 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sg2sp\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:17 crc kubenswrapper[4923]: I0224 03:24:17.113814 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d76z9\" (UniqueName: \"kubernetes.io/projected/140f3efa-43c3-4d0b-a738-fc87e216c13b-kube-api-access-d76z9\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-sg2sp\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:17 crc kubenswrapper[4923]: I0224 03:24:17.275917 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:17 crc kubenswrapper[4923]: I0224 03:24:17.915368 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"]
Feb 24 03:24:18 crc kubenswrapper[4923]: I0224 03:24:18.874206 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp" event={"ID":"140f3efa-43c3-4d0b-a738-fc87e216c13b","Type":"ContainerStarted","Data":"b976ee3db1f27ed57b031990835214f031b01a6d4f91c71fc9f3e5545620bfbe"}
Feb 24 03:24:18 crc kubenswrapper[4923]: I0224 03:24:18.874782 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp" event={"ID":"140f3efa-43c3-4d0b-a738-fc87e216c13b","Type":"ContainerStarted","Data":"007af0dea1f34177e6bad46933770ce0c4bb2b69bf4c5b2650c7733f0af3eaee"}
Feb 24 03:24:18 crc kubenswrapper[4923]: I0224 03:24:18.906889 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp" podStartSLOduration=2.501034237 podStartE2EDuration="2.90686666s" podCreationTimestamp="2026-02-24 03:24:16 +0000 UTC" firstStartedPulling="2026-02-24 03:24:17.909368834 +0000 UTC m=+1781.926439687" lastFinishedPulling="2026-02-24 03:24:18.315201287 +0000 UTC m=+1782.332272110" observedRunningTime="2026-02-24 03:24:18.892438414 +0000 UTC m=+1782.909509237" watchObservedRunningTime="2026-02-24 03:24:18.90686666 +0000 UTC m=+1782.923937483"
Feb 24 03:24:19 crc kubenswrapper[4923]: I0224 03:24:19.918739 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 03:24:19 crc kubenswrapper[4923]: I0224 03:24:19.919265 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 03:24:19 crc kubenswrapper[4923]: I0224 03:24:19.919421 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 03:24:19 crc kubenswrapper[4923]: I0224 03:24:19.920575 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 03:24:19 crc kubenswrapper[4923]: I0224 03:24:19.920682 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" gracePeriod=600
Feb 24 03:24:20 crc kubenswrapper[4923]: E0224 03:24:20.053153 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec"
Feb 24 03:24:20 crc kubenswrapper[4923]: I0224 03:24:20.897887 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" exitCode=0
Feb 24 03:24:20 crc kubenswrapper[4923]: I0224 03:24:20.897972 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6"}
Feb 24 03:24:20 crc kubenswrapper[4923]: I0224 03:24:20.898326 4923 scope.go:117] "RemoveContainer" containerID="0faeb363e0b14f83a047ef04f8fa2df18f1991b14418890ba609de06ecd5c251"
Feb 24 03:24:20 crc kubenswrapper[4923]: I0224 03:24:20.899280 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6"
Feb 24 03:24:20 crc kubenswrapper[4923]: E0224 03:24:20.899814 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec"
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.902931 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-c7ns9"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.913112 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-03dc-account-create-update-blt67"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.921744 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fde5-account-create-update-bkgr4"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.928718 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0144-account-create-update-j6dm5"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.935156 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lplvk"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.941499 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2xw9b"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.950186 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-c7ns9"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.956627 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-03dc-account-create-update-blt67"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.964287 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0144-account-create-update-j6dm5"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.971405 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-fde5-account-create-update-bkgr4"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.977224 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lplvk"]
Feb 24 03:24:24 crc kubenswrapper[4923]: I0224 03:24:24.982858 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2xw9b"]
Feb 24 03:24:25 crc kubenswrapper[4923]: I0224 03:24:25.726204 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2465587c-f0e4-4755-acd6-016d2b1a4cbf" path="/var/lib/kubelet/pods/2465587c-f0e4-4755-acd6-016d2b1a4cbf/volumes"
Feb 24 03:24:25 crc kubenswrapper[4923]: I0224 03:24:25.727679 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395dac89-2c47-446a-8254-0ba691868651" path="/var/lib/kubelet/pods/395dac89-2c47-446a-8254-0ba691868651/volumes"
Feb 24 03:24:25 crc kubenswrapper[4923]: I0224 03:24:25.728949 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80844021-cbc2-4e7f-bbd8-1bac3ae22d98" path="/var/lib/kubelet/pods/80844021-cbc2-4e7f-bbd8-1bac3ae22d98/volumes"
Feb 24 03:24:25 crc kubenswrapper[4923]: I0224 03:24:25.730117 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b92176-765c-478f-a4df-26d52a903476" path="/var/lib/kubelet/pods/92b92176-765c-478f-a4df-26d52a903476/volumes"
Feb 24 03:24:25 crc kubenswrapper[4923]: I0224 03:24:25.732180 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d86572db-86bc-4cd2-b551-a66bf5c47c7a" path="/var/lib/kubelet/pods/d86572db-86bc-4cd2-b551-a66bf5c47c7a/volumes"
Feb 24 03:24:25 crc kubenswrapper[4923]: I0224 03:24:25.733581 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f491572b-dcd8-40f5-96e0-d6393b852858" path="/var/lib/kubelet/pods/f491572b-dcd8-40f5-96e0-d6393b852858/volumes"
Feb 24 03:24:34 crc kubenswrapper[4923]: I0224 03:24:34.714014 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6"
Feb 24 03:24:34 crc kubenswrapper[4923]: E0224 03:24:34.715262 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec"
Feb 24 03:24:43 crc kubenswrapper[4923]: I0224 03:24:43.254016 4923 scope.go:117] "RemoveContainer" containerID="bafe4c81209226219c61c81e8f739f9ef46acfd44534f718c33bc0d3edfd688c"
Feb 24 03:24:43 crc kubenswrapper[4923]: I0224 03:24:43.283449 4923 scope.go:117] "RemoveContainer" containerID="37eb58f3e17d647a959a2ac1115b7933598482b86904200c5c5c2f9e0e7dbb47"
Feb 24 03:24:43 crc kubenswrapper[4923]: I0224 03:24:43.358906 4923 scope.go:117] "RemoveContainer" containerID="cd964862e704eeaad08f08ea2b13d80bf6211d9a083485cc621c49d7a49ef805"
Feb 24 03:24:43 crc kubenswrapper[4923]: I0224 03:24:43.405170 4923 scope.go:117] "RemoveContainer" containerID="858d5a53348c622d378ba1c9b0db685104423bf71d15ff285f867b6208fdf657"
Feb 24 03:24:43 crc kubenswrapper[4923]: I0224 03:24:43.453513 4923 scope.go:117] "RemoveContainer" containerID="f2e0dd235d7bc4fb88fb72b7bfb56721c5e7b331d69cd81a9ca05dcfbaa64d84"
Feb 24 03:24:43 crc kubenswrapper[4923]: I0224 03:24:43.493207 4923 scope.go:117] "RemoveContainer" containerID="986592f6665163ed2d2a4d9717fc92bb03a0a2a4b22849ea18722264ec565113"
Feb 24 03:24:43 crc kubenswrapper[4923]: I0224 03:24:43.531342 4923 scope.go:117] "RemoveContainer" containerID="618aa6a6ca8bf3941d9f7264ebf924c5e788d9a4cc3d5451ab4cf331a09c9506"
Feb 24 03:24:43 crc kubenswrapper[4923]: I0224 03:24:43.547909 4923 scope.go:117] "RemoveContainer" containerID="75d57347f54d6b4778b6aac2f865ba00f5472bb0064d1ce6be80c984c7df2aa1"
Feb 24 03:24:48 crc kubenswrapper[4923]: I0224 03:24:48.713092 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6"
Feb 24 03:24:48 crc kubenswrapper[4923]: E0224 03:24:48.713907 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec"
Feb 24 03:24:53 crc kubenswrapper[4923]: I0224 03:24:53.043756 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlhvh"]
Feb 24 03:24:53 crc kubenswrapper[4923]: I0224 03:24:53.050220 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qlhvh"]
Feb 24 03:24:53 crc kubenswrapper[4923]: I0224 03:24:53.723676 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5bfccf6-bf49-4a31-9367-afe9f29cbf74" path="/var/lib/kubelet/pods/a5bfccf6-bf49-4a31-9367-afe9f29cbf74/volumes"
Feb 24 03:24:56 crc kubenswrapper[4923]: I0224 03:24:56.228041 4923 generic.go:334] "Generic (PLEG): container finished" podID="140f3efa-43c3-4d0b-a738-fc87e216c13b" containerID="b976ee3db1f27ed57b031990835214f031b01a6d4f91c71fc9f3e5545620bfbe" exitCode=0
Feb 24 03:24:56 crc kubenswrapper[4923]: I0224 03:24:56.228129 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp" event={"ID":"140f3efa-43c3-4d0b-a738-fc87e216c13b","Type":"ContainerDied","Data":"b976ee3db1f27ed57b031990835214f031b01a6d4f91c71fc9f3e5545620bfbe"}
Feb 24 03:24:57 crc kubenswrapper[4923]: I0224 03:24:57.686804 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:57 crc kubenswrapper[4923]: I0224 03:24:57.779124 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d76z9\" (UniqueName: \"kubernetes.io/projected/140f3efa-43c3-4d0b-a738-fc87e216c13b-kube-api-access-d76z9\") pod \"140f3efa-43c3-4d0b-a738-fc87e216c13b\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") "
Feb 24 03:24:57 crc kubenswrapper[4923]: I0224 03:24:57.779174 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-inventory\") pod \"140f3efa-43c3-4d0b-a738-fc87e216c13b\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") "
Feb 24 03:24:57 crc kubenswrapper[4923]: I0224 03:24:57.779284 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-ssh-key-openstack-edpm-ipam\") pod \"140f3efa-43c3-4d0b-a738-fc87e216c13b\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") "
Feb 24 03:24:57 crc kubenswrapper[4923]: I0224 03:24:57.785173 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140f3efa-43c3-4d0b-a738-fc87e216c13b-kube-api-access-d76z9" (OuterVolumeSpecName: "kube-api-access-d76z9") pod "140f3efa-43c3-4d0b-a738-fc87e216c13b" (UID: "140f3efa-43c3-4d0b-a738-fc87e216c13b"). InnerVolumeSpecName "kube-api-access-d76z9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 03:24:57 crc kubenswrapper[4923]: E0224 03:24:57.808202 4923 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-inventory podName:140f3efa-43c3-4d0b-a738-fc87e216c13b nodeName:}" failed. No retries permitted until 2026-02-24 03:24:58.308177343 +0000 UTC m=+1822.325248156 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-inventory") pod "140f3efa-43c3-4d0b-a738-fc87e216c13b" (UID: "140f3efa-43c3-4d0b-a738-fc87e216c13b") : error deleting /var/lib/kubelet/pods/140f3efa-43c3-4d0b-a738-fc87e216c13b/volume-subpaths: remove /var/lib/kubelet/pods/140f3efa-43c3-4d0b-a738-fc87e216c13b/volume-subpaths: no such file or directory
Feb 24 03:24:57 crc kubenswrapper[4923]: I0224 03:24:57.810922 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "140f3efa-43c3-4d0b-a738-fc87e216c13b" (UID: "140f3efa-43c3-4d0b-a738-fc87e216c13b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 03:24:57 crc kubenswrapper[4923]: I0224 03:24:57.881098 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d76z9\" (UniqueName: \"kubernetes.io/projected/140f3efa-43c3-4d0b-a738-fc87e216c13b-kube-api-access-d76z9\") on node \"crc\" DevicePath \"\""
Feb 24 03:24:57 crc kubenswrapper[4923]: I0224 03:24:57.881132 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.246081 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp" event={"ID":"140f3efa-43c3-4d0b-a738-fc87e216c13b","Type":"ContainerDied","Data":"007af0dea1f34177e6bad46933770ce0c4bb2b69bf4c5b2650c7733f0af3eaee"}
Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.246118 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="007af0dea1f34177e6bad46933770ce0c4bb2b69bf4c5b2650c7733f0af3eaee"
Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.246167 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-sg2sp"
Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.368987 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc"]
Feb 24 03:24:58 crc kubenswrapper[4923]: E0224 03:24:58.369471 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140f3efa-43c3-4d0b-a738-fc87e216c13b" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.369494 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="140f3efa-43c3-4d0b-a738-fc87e216c13b" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.369718 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="140f3efa-43c3-4d0b-a738-fc87e216c13b" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.370681 4923 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.381559 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc"] Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.394373 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-inventory\") pod \"140f3efa-43c3-4d0b-a738-fc87e216c13b\" (UID: \"140f3efa-43c3-4d0b-a738-fc87e216c13b\") " Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.399476 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-inventory" (OuterVolumeSpecName: "inventory") pod "140f3efa-43c3-4d0b-a738-fc87e216c13b" (UID: "140f3efa-43c3-4d0b-a738-fc87e216c13b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.496860 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smdtc\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.497067 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk7d2\" (UniqueName: \"kubernetes.io/projected/ec71f0e3-4ff0-46b6-a887-37132374b80c-kube-api-access-tk7d2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smdtc\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.497148 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smdtc\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.497261 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/140f3efa-43c3-4d0b-a738-fc87e216c13b-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.598396 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk7d2\" (UniqueName: \"kubernetes.io/projected/ec71f0e3-4ff0-46b6-a887-37132374b80c-kube-api-access-tk7d2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smdtc\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.598477 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smdtc\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.598533 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smdtc\" 
(UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.602486 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smdtc\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.602999 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smdtc\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.616646 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk7d2\" (UniqueName: \"kubernetes.io/projected/ec71f0e3-4ff0-46b6-a887-37132374b80c-kube-api-access-tk7d2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smdtc\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:58 crc kubenswrapper[4923]: I0224 03:24:58.697214 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:24:59 crc kubenswrapper[4923]: I0224 03:24:59.229894 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc"] Feb 24 03:24:59 crc kubenswrapper[4923]: I0224 03:24:59.255272 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" event={"ID":"ec71f0e3-4ff0-46b6-a887-37132374b80c","Type":"ContainerStarted","Data":"d4766cfe463a102b5d535aa8de06b87164eccc6c8facd4b2f7b08c3aba1305d6"} Feb 24 03:25:00 crc kubenswrapper[4923]: I0224 03:25:00.266600 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" event={"ID":"ec71f0e3-4ff0-46b6-a887-37132374b80c","Type":"ContainerStarted","Data":"5bd93f74c0d96b571d8f55ede2292d38485b7206c00e45cc5c380d911acf1fdc"} Feb 24 03:25:00 crc kubenswrapper[4923]: I0224 03:25:00.294292 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" podStartSLOduration=1.8020275940000001 podStartE2EDuration="2.294272463s" podCreationTimestamp="2026-02-24 03:24:58 +0000 UTC" firstStartedPulling="2026-02-24 03:24:59.237187711 +0000 UTC m=+1823.254258524" lastFinishedPulling="2026-02-24 03:24:59.72943258 +0000 UTC m=+1823.746503393" observedRunningTime="2026-02-24 03:25:00.287079195 +0000 UTC m=+1824.304150008" watchObservedRunningTime="2026-02-24 03:25:00.294272463 +0000 UTC m=+1824.311343276" Feb 24 03:25:00 crc kubenswrapper[4923]: I0224 03:25:00.713049 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:25:00 crc kubenswrapper[4923]: E0224 03:25:00.713552 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:25:14 crc kubenswrapper[4923]: I0224 03:25:14.713064 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:25:14 crc kubenswrapper[4923]: E0224 03:25:14.714099 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:25:29 crc kubenswrapper[4923]: I0224 03:25:29.713728 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:25:29 crc kubenswrapper[4923]: E0224 03:25:29.714588 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:25:43 crc kubenswrapper[4923]: I0224 03:25:43.719609 4923 scope.go:117] "RemoveContainer" containerID="a6d81afc0f1ea5db3c8e12dd2b3757d1cf37f7e3b919a57b2128d42f99e88eba" Feb 24 03:25:44 crc kubenswrapper[4923]: I0224 03:25:44.713574 4923 scope.go:117] "RemoveContainer" 
containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:25:44 crc kubenswrapper[4923]: E0224 03:25:44.714406 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:25:46 crc kubenswrapper[4923]: I0224 03:25:46.065601 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-clrmb"] Feb 24 03:25:46 crc kubenswrapper[4923]: I0224 03:25:46.086143 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-clrmb"] Feb 24 03:25:47 crc kubenswrapper[4923]: I0224 03:25:47.032702 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gpqwz"] Feb 24 03:25:47 crc kubenswrapper[4923]: I0224 03:25:47.043061 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gpqwz"] Feb 24 03:25:47 crc kubenswrapper[4923]: I0224 03:25:47.723177 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="557d4e5b-4b4a-4eee-b199-533822f52b8f" path="/var/lib/kubelet/pods/557d4e5b-4b4a-4eee-b199-533822f52b8f/volumes" Feb 24 03:25:47 crc kubenswrapper[4923]: I0224 03:25:47.724033 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5631e12f-8fa0-49b5-b6e8-7b0193f2a419" path="/var/lib/kubelet/pods/5631e12f-8fa0-49b5-b6e8-7b0193f2a419/volumes" Feb 24 03:25:49 crc kubenswrapper[4923]: I0224 03:25:49.761154 4923 generic.go:334] "Generic (PLEG): container finished" podID="ec71f0e3-4ff0-46b6-a887-37132374b80c" containerID="5bd93f74c0d96b571d8f55ede2292d38485b7206c00e45cc5c380d911acf1fdc" exitCode=0 
Feb 24 03:25:49 crc kubenswrapper[4923]: I0224 03:25:49.761258 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" event={"ID":"ec71f0e3-4ff0-46b6-a887-37132374b80c","Type":"ContainerDied","Data":"5bd93f74c0d96b571d8f55ede2292d38485b7206c00e45cc5c380d911acf1fdc"} Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.197522 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.286090 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-inventory\") pod \"ec71f0e3-4ff0-46b6-a887-37132374b80c\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.286621 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-ssh-key-openstack-edpm-ipam\") pod \"ec71f0e3-4ff0-46b6-a887-37132374b80c\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.286745 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk7d2\" (UniqueName: \"kubernetes.io/projected/ec71f0e3-4ff0-46b6-a887-37132374b80c-kube-api-access-tk7d2\") pod \"ec71f0e3-4ff0-46b6-a887-37132374b80c\" (UID: \"ec71f0e3-4ff0-46b6-a887-37132374b80c\") " Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.296499 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec71f0e3-4ff0-46b6-a887-37132374b80c-kube-api-access-tk7d2" (OuterVolumeSpecName: "kube-api-access-tk7d2") pod "ec71f0e3-4ff0-46b6-a887-37132374b80c" (UID: 
"ec71f0e3-4ff0-46b6-a887-37132374b80c"). InnerVolumeSpecName "kube-api-access-tk7d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.311461 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-inventory" (OuterVolumeSpecName: "inventory") pod "ec71f0e3-4ff0-46b6-a887-37132374b80c" (UID: "ec71f0e3-4ff0-46b6-a887-37132374b80c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.311812 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ec71f0e3-4ff0-46b6-a887-37132374b80c" (UID: "ec71f0e3-4ff0-46b6-a887-37132374b80c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.389106 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk7d2\" (UniqueName: \"kubernetes.io/projected/ec71f0e3-4ff0-46b6-a887-37132374b80c-kube-api-access-tk7d2\") on node \"crc\" DevicePath \"\"" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.389152 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.389163 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec71f0e3-4ff0-46b6-a887-37132374b80c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.781003 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" event={"ID":"ec71f0e3-4ff0-46b6-a887-37132374b80c","Type":"ContainerDied","Data":"d4766cfe463a102b5d535aa8de06b87164eccc6c8facd4b2f7b08c3aba1305d6"} Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.781047 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4766cfe463a102b5d535aa8de06b87164eccc6c8facd4b2f7b08c3aba1305d6" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.781053 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smdtc" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.872532 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m2jt2"] Feb 24 03:25:51 crc kubenswrapper[4923]: E0224 03:25:51.872967 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec71f0e3-4ff0-46b6-a887-37132374b80c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.872994 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec71f0e3-4ff0-46b6-a887-37132374b80c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.873236 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec71f0e3-4ff0-46b6-a887-37132374b80c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.873995 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.876589 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.876772 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.876905 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.881636 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.887610 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m2jt2"] Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.999263 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v575l\" (UniqueName: \"kubernetes.io/projected/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-kube-api-access-v575l\") pod \"ssh-known-hosts-edpm-deployment-m2jt2\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.999501 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m2jt2\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:51 crc kubenswrapper[4923]: I0224 03:25:51.999610 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m2jt2\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:52 crc kubenswrapper[4923]: I0224 03:25:52.102704 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v575l\" (UniqueName: \"kubernetes.io/projected/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-kube-api-access-v575l\") pod \"ssh-known-hosts-edpm-deployment-m2jt2\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:52 crc kubenswrapper[4923]: I0224 03:25:52.102953 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m2jt2\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:52 crc kubenswrapper[4923]: I0224 03:25:52.103064 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m2jt2\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:52 crc kubenswrapper[4923]: I0224 03:25:52.108800 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-m2jt2\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:52 crc kubenswrapper[4923]: I0224 03:25:52.119674 4923 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-m2jt2\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:52 crc kubenswrapper[4923]: I0224 03:25:52.127118 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v575l\" (UniqueName: \"kubernetes.io/projected/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-kube-api-access-v575l\") pod \"ssh-known-hosts-edpm-deployment-m2jt2\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:52 crc kubenswrapper[4923]: I0224 03:25:52.192086 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:25:52 crc kubenswrapper[4923]: I0224 03:25:52.776927 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-m2jt2"] Feb 24 03:25:52 crc kubenswrapper[4923]: I0224 03:25:52.794275 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" event={"ID":"b51afcd9-da3d-4f68-947a-c6af0a02cfaa","Type":"ContainerStarted","Data":"e44e8c37785868960c67a164d41cced7639515e675afa9c0808af2416ee2e7dc"} Feb 24 03:25:53 crc kubenswrapper[4923]: I0224 03:25:53.806801 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" event={"ID":"b51afcd9-da3d-4f68-947a-c6af0a02cfaa","Type":"ContainerStarted","Data":"b5dec6461692d3c3287affa13f77db1fca3b2c4bd0485bd3eb6a8494c1daf8c6"} Feb 24 03:25:53 crc kubenswrapper[4923]: I0224 03:25:53.846839 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" 
podStartSLOduration=2.355675031 podStartE2EDuration="2.846820531s" podCreationTimestamp="2026-02-24 03:25:51 +0000 UTC" firstStartedPulling="2026-02-24 03:25:52.785989132 +0000 UTC m=+1876.803059985" lastFinishedPulling="2026-02-24 03:25:53.277134672 +0000 UTC m=+1877.294205485" observedRunningTime="2026-02-24 03:25:53.832374434 +0000 UTC m=+1877.849445257" watchObservedRunningTime="2026-02-24 03:25:53.846820531 +0000 UTC m=+1877.863891354" Feb 24 03:25:55 crc kubenswrapper[4923]: I0224 03:25:55.712517 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:25:55 crc kubenswrapper[4923]: E0224 03:25:55.713024 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:26:00 crc kubenswrapper[4923]: I0224 03:26:00.878091 4923 generic.go:334] "Generic (PLEG): container finished" podID="b51afcd9-da3d-4f68-947a-c6af0a02cfaa" containerID="b5dec6461692d3c3287affa13f77db1fca3b2c4bd0485bd3eb6a8494c1daf8c6" exitCode=0 Feb 24 03:26:00 crc kubenswrapper[4923]: I0224 03:26:00.878212 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" event={"ID":"b51afcd9-da3d-4f68-947a-c6af0a02cfaa","Type":"ContainerDied","Data":"b5dec6461692d3c3287affa13f77db1fca3b2c4bd0485bd3eb6a8494c1daf8c6"} Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.316155 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.478708 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v575l\" (UniqueName: \"kubernetes.io/projected/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-kube-api-access-v575l\") pod \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.478798 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-ssh-key-openstack-edpm-ipam\") pod \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.478827 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-inventory-0\") pod \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\" (UID: \"b51afcd9-da3d-4f68-947a-c6af0a02cfaa\") " Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.485694 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-kube-api-access-v575l" (OuterVolumeSpecName: "kube-api-access-v575l") pod "b51afcd9-da3d-4f68-947a-c6af0a02cfaa" (UID: "b51afcd9-da3d-4f68-947a-c6af0a02cfaa"). InnerVolumeSpecName "kube-api-access-v575l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.532151 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b51afcd9-da3d-4f68-947a-c6af0a02cfaa" (UID: "b51afcd9-da3d-4f68-947a-c6af0a02cfaa"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.534227 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b51afcd9-da3d-4f68-947a-c6af0a02cfaa" (UID: "b51afcd9-da3d-4f68-947a-c6af0a02cfaa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.582069 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v575l\" (UniqueName: \"kubernetes.io/projected/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-kube-api-access-v575l\") on node \"crc\" DevicePath \"\"" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.582106 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.582121 4923 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b51afcd9-da3d-4f68-947a-c6af0a02cfaa-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.919487 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.919399 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-m2jt2" event={"ID":"b51afcd9-da3d-4f68-947a-c6af0a02cfaa","Type":"ContainerDied","Data":"e44e8c37785868960c67a164d41cced7639515e675afa9c0808af2416ee2e7dc"} Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.929659 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e44e8c37785868960c67a164d41cced7639515e675afa9c0808af2416ee2e7dc" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.968241 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm"] Feb 24 03:26:02 crc kubenswrapper[4923]: E0224 03:26:02.968769 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51afcd9-da3d-4f68-947a-c6af0a02cfaa" containerName="ssh-known-hosts-edpm-deployment" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.968791 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51afcd9-da3d-4f68-947a-c6af0a02cfaa" containerName="ssh-known-hosts-edpm-deployment" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.969060 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51afcd9-da3d-4f68-947a-c6af0a02cfaa" containerName="ssh-known-hosts-edpm-deployment" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.969962 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.975183 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.975428 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.975562 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.975732 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:26:02 crc kubenswrapper[4923]: I0224 03:26:02.979534 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm"] Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.092548 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb5t7\" (UniqueName: \"kubernetes.io/projected/a784fe18-eb1d-4e0d-84cb-9268b1904302-kube-api-access-nb5t7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f4pqm\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.092629 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f4pqm\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.092683 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f4pqm\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.195563 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f4pqm\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.196100 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb5t7\" (UniqueName: \"kubernetes.io/projected/a784fe18-eb1d-4e0d-84cb-9268b1904302-kube-api-access-nb5t7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f4pqm\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.196238 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f4pqm\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.200042 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-f4pqm\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.205936 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f4pqm\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.217073 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb5t7\" (UniqueName: \"kubernetes.io/projected/a784fe18-eb1d-4e0d-84cb-9268b1904302-kube-api-access-nb5t7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-f4pqm\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.286505 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.835350 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm"] Feb 24 03:26:03 crc kubenswrapper[4923]: W0224 03:26:03.835870 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda784fe18_eb1d_4e0d_84cb_9268b1904302.slice/crio-6fc8512059a9e1b94fedc8272c5732b0195648407c8379bd53d2d5618e2d2d5e WatchSource:0}: Error finding container 6fc8512059a9e1b94fedc8272c5732b0195648407c8379bd53d2d5618e2d2d5e: Status 404 returned error can't find the container with id 6fc8512059a9e1b94fedc8272c5732b0195648407c8379bd53d2d5618e2d2d5e Feb 24 03:26:03 crc kubenswrapper[4923]: I0224 03:26:03.931801 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" event={"ID":"a784fe18-eb1d-4e0d-84cb-9268b1904302","Type":"ContainerStarted","Data":"6fc8512059a9e1b94fedc8272c5732b0195648407c8379bd53d2d5618e2d2d5e"} Feb 24 03:26:04 crc kubenswrapper[4923]: I0224 03:26:04.943188 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" event={"ID":"a784fe18-eb1d-4e0d-84cb-9268b1904302","Type":"ContainerStarted","Data":"85128b5e7bf55d797b38b59581ab0ef459985c8b6efddb94d0e31bc3c0b4bff5"} Feb 24 03:26:04 crc kubenswrapper[4923]: I0224 03:26:04.964922 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" podStartSLOduration=2.535792336 podStartE2EDuration="2.964900297s" podCreationTimestamp="2026-02-24 03:26:02 +0000 UTC" firstStartedPulling="2026-02-24 03:26:03.839900892 +0000 UTC m=+1887.856971715" lastFinishedPulling="2026-02-24 03:26:04.269008843 +0000 UTC m=+1888.286079676" observedRunningTime="2026-02-24 
03:26:04.963024508 +0000 UTC m=+1888.980095331" watchObservedRunningTime="2026-02-24 03:26:04.964900297 +0000 UTC m=+1888.981971120" Feb 24 03:26:06 crc kubenswrapper[4923]: I0224 03:26:06.713856 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:26:06 crc kubenswrapper[4923]: E0224 03:26:06.714629 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:26:13 crc kubenswrapper[4923]: I0224 03:26:13.026875 4923 generic.go:334] "Generic (PLEG): container finished" podID="a784fe18-eb1d-4e0d-84cb-9268b1904302" containerID="85128b5e7bf55d797b38b59581ab0ef459985c8b6efddb94d0e31bc3c0b4bff5" exitCode=0 Feb 24 03:26:13 crc kubenswrapper[4923]: I0224 03:26:13.027086 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" event={"ID":"a784fe18-eb1d-4e0d-84cb-9268b1904302","Type":"ContainerDied","Data":"85128b5e7bf55d797b38b59581ab0ef459985c8b6efddb94d0e31bc3c0b4bff5"} Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.555768 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.640721 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-ssh-key-openstack-edpm-ipam\") pod \"a784fe18-eb1d-4e0d-84cb-9268b1904302\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.641038 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb5t7\" (UniqueName: \"kubernetes.io/projected/a784fe18-eb1d-4e0d-84cb-9268b1904302-kube-api-access-nb5t7\") pod \"a784fe18-eb1d-4e0d-84cb-9268b1904302\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.641056 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-inventory\") pod \"a784fe18-eb1d-4e0d-84cb-9268b1904302\" (UID: \"a784fe18-eb1d-4e0d-84cb-9268b1904302\") " Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.648542 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a784fe18-eb1d-4e0d-84cb-9268b1904302-kube-api-access-nb5t7" (OuterVolumeSpecName: "kube-api-access-nb5t7") pod "a784fe18-eb1d-4e0d-84cb-9268b1904302" (UID: "a784fe18-eb1d-4e0d-84cb-9268b1904302"). InnerVolumeSpecName "kube-api-access-nb5t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.666795 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-inventory" (OuterVolumeSpecName: "inventory") pod "a784fe18-eb1d-4e0d-84cb-9268b1904302" (UID: "a784fe18-eb1d-4e0d-84cb-9268b1904302"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.677878 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a784fe18-eb1d-4e0d-84cb-9268b1904302" (UID: "a784fe18-eb1d-4e0d-84cb-9268b1904302"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.743241 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.743270 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb5t7\" (UniqueName: \"kubernetes.io/projected/a784fe18-eb1d-4e0d-84cb-9268b1904302-kube-api-access-nb5t7\") on node \"crc\" DevicePath \"\"" Feb 24 03:26:14 crc kubenswrapper[4923]: I0224 03:26:14.743280 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a784fe18-eb1d-4e0d-84cb-9268b1904302-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.053798 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" event={"ID":"a784fe18-eb1d-4e0d-84cb-9268b1904302","Type":"ContainerDied","Data":"6fc8512059a9e1b94fedc8272c5732b0195648407c8379bd53d2d5618e2d2d5e"} Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.053850 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fc8512059a9e1b94fedc8272c5732b0195648407c8379bd53d2d5618e2d2d5e" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 
03:26:15.053899 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-f4pqm" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.138029 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg"] Feb 24 03:26:15 crc kubenswrapper[4923]: E0224 03:26:15.138450 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a784fe18-eb1d-4e0d-84cb-9268b1904302" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.138473 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a784fe18-eb1d-4e0d-84cb-9268b1904302" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.138660 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="a784fe18-eb1d-4e0d-84cb-9268b1904302" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.139432 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.142011 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.142239 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.142449 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.143648 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.153402 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsjpv\" (UniqueName: \"kubernetes.io/projected/3444189b-88ae-469b-810d-e92a9a0c17d8-kube-api-access-jsjpv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-976cg\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.153516 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-976cg\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.153565 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-976cg\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.195181 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg"] Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.255850 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-976cg\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.255910 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-976cg\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.256035 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsjpv\" (UniqueName: \"kubernetes.io/projected/3444189b-88ae-469b-810d-e92a9a0c17d8-kube-api-access-jsjpv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-976cg\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.264212 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-976cg\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.264259 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-976cg\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.275883 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsjpv\" (UniqueName: \"kubernetes.io/projected/3444189b-88ae-469b-810d-e92a9a0c17d8-kube-api-access-jsjpv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-976cg\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.478105 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:15 crc kubenswrapper[4923]: I0224 03:26:15.978613 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg"] Feb 24 03:26:16 crc kubenswrapper[4923]: I0224 03:26:16.064771 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" event={"ID":"3444189b-88ae-469b-810d-e92a9a0c17d8","Type":"ContainerStarted","Data":"e6e7746456a3d7b74551bdd0f124cf73c2a1700b519e58262e8ea813ab7fb5ca"} Feb 24 03:26:17 crc kubenswrapper[4923]: I0224 03:26:17.075975 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" event={"ID":"3444189b-88ae-469b-810d-e92a9a0c17d8","Type":"ContainerStarted","Data":"af1d54e6d0ea756bf0d64e11633b12095c7cf5bb6f0a4e22f2a033724aa0db12"} Feb 24 03:26:17 crc kubenswrapper[4923]: I0224 03:26:17.096606 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" podStartSLOduration=1.401039764 podStartE2EDuration="2.096587499s" podCreationTimestamp="2026-02-24 03:26:15 +0000 UTC" firstStartedPulling="2026-02-24 03:26:15.988550938 +0000 UTC m=+1900.005621781" lastFinishedPulling="2026-02-24 03:26:16.684098703 +0000 UTC m=+1900.701169516" observedRunningTime="2026-02-24 03:26:17.094459954 +0000 UTC m=+1901.111530777" watchObservedRunningTime="2026-02-24 03:26:17.096587499 +0000 UTC m=+1901.113658312" Feb 24 03:26:17 crc kubenswrapper[4923]: I0224 03:26:17.733888 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:26:17 crc kubenswrapper[4923]: E0224 03:26:17.734375 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:26:27 crc kubenswrapper[4923]: I0224 03:26:27.186683 4923 generic.go:334] "Generic (PLEG): container finished" podID="3444189b-88ae-469b-810d-e92a9a0c17d8" containerID="af1d54e6d0ea756bf0d64e11633b12095c7cf5bb6f0a4e22f2a033724aa0db12" exitCode=0 Feb 24 03:26:27 crc kubenswrapper[4923]: I0224 03:26:27.186783 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" event={"ID":"3444189b-88ae-469b-810d-e92a9a0c17d8","Type":"ContainerDied","Data":"af1d54e6d0ea756bf0d64e11633b12095c7cf5bb6f0a4e22f2a033724aa0db12"} Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.713334 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:26:28 crc kubenswrapper[4923]: E0224 03:26:28.714283 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.781505 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.877519 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-inventory\") pod \"3444189b-88ae-469b-810d-e92a9a0c17d8\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.877637 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsjpv\" (UniqueName: \"kubernetes.io/projected/3444189b-88ae-469b-810d-e92a9a0c17d8-kube-api-access-jsjpv\") pod \"3444189b-88ae-469b-810d-e92a9a0c17d8\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.877767 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-ssh-key-openstack-edpm-ipam\") pod \"3444189b-88ae-469b-810d-e92a9a0c17d8\" (UID: \"3444189b-88ae-469b-810d-e92a9a0c17d8\") " Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.882797 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3444189b-88ae-469b-810d-e92a9a0c17d8-kube-api-access-jsjpv" (OuterVolumeSpecName: "kube-api-access-jsjpv") pod "3444189b-88ae-469b-810d-e92a9a0c17d8" (UID: "3444189b-88ae-469b-810d-e92a9a0c17d8"). InnerVolumeSpecName "kube-api-access-jsjpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.912337 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3444189b-88ae-469b-810d-e92a9a0c17d8" (UID: "3444189b-88ae-469b-810d-e92a9a0c17d8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.924195 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-inventory" (OuterVolumeSpecName: "inventory") pod "3444189b-88ae-469b-810d-e92a9a0c17d8" (UID: "3444189b-88ae-469b-810d-e92a9a0c17d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.980668 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsjpv\" (UniqueName: \"kubernetes.io/projected/3444189b-88ae-469b-810d-e92a9a0c17d8-kube-api-access-jsjpv\") on node \"crc\" DevicePath \"\"" Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.980745 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:26:28 crc kubenswrapper[4923]: I0224 03:26:28.980771 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3444189b-88ae-469b-810d-e92a9a0c17d8-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.231576 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" 
event={"ID":"3444189b-88ae-469b-810d-e92a9a0c17d8","Type":"ContainerDied","Data":"e6e7746456a3d7b74551bdd0f124cf73c2a1700b519e58262e8ea813ab7fb5ca"} Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.231642 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e7746456a3d7b74551bdd0f124cf73c2a1700b519e58262e8ea813ab7fb5ca" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.231715 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-976cg" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.314081 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9"] Feb 24 03:26:29 crc kubenswrapper[4923]: E0224 03:26:29.314656 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3444189b-88ae-469b-810d-e92a9a0c17d8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.314681 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="3444189b-88ae-469b-810d-e92a9a0c17d8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.314890 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="3444189b-88ae-469b-810d-e92a9a0c17d8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.315712 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.318421 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.318565 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.321026 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.321581 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.322775 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.322829 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.322921 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.322782 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.333977 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9"] Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.387657 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.387695 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzm76\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-kube-api-access-fzm76\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.387826 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.387869 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.387915 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.387935 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.387954 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.388067 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.388129 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.388155 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.388341 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.388537 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.388603 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.388645 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.490791 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.490882 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.490917 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.490947 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491007 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491061 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491102 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" 
(UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491171 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491248 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491292 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491356 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: 
\"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491500 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491541 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzm76\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-kube-api-access-fzm76\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.491633 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.496193 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc 
kubenswrapper[4923]: I0224 03:26:29.496356 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.497059 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.497362 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.497541 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.498123 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.499035 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.499048 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.499978 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.500571 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.500620 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.501368 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.503675 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.518408 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzm76\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-kube-api-access-fzm76\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:29 crc kubenswrapper[4923]: I0224 03:26:29.660326 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:26:30 crc kubenswrapper[4923]: I0224 03:26:30.047002 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mvsg9"] Feb 24 03:26:30 crc kubenswrapper[4923]: I0224 03:26:30.058575 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mvsg9"] Feb 24 03:26:30 crc kubenswrapper[4923]: I0224 03:26:30.210919 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9"] Feb 24 03:26:30 crc kubenswrapper[4923]: I0224 03:26:30.244592 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" event={"ID":"f8dab472-e2b2-4eab-8ced-7eed7b1bc842","Type":"ContainerStarted","Data":"6ba3ad23ff5c96906e6acdac7394decd2ff2ef6b205b3485cd64e39e81663e48"} Feb 24 03:26:31 crc kubenswrapper[4923]: I0224 03:26:31.252160 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" event={"ID":"f8dab472-e2b2-4eab-8ced-7eed7b1bc842","Type":"ContainerStarted","Data":"237d211e1a2f0f742f868651c62e575767dfc281c35493a142eb050507f371be"} Feb 24 03:26:31 crc kubenswrapper[4923]: I0224 03:26:31.282055 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" podStartSLOduration=1.8652565490000002 podStartE2EDuration="2.282036137s" podCreationTimestamp="2026-02-24 03:26:29 +0000 UTC" firstStartedPulling="2026-02-24 03:26:30.222115762 +0000 UTC m=+1914.239186575" lastFinishedPulling="2026-02-24 03:26:30.63889535 +0000 UTC m=+1914.655966163" 
observedRunningTime="2026-02-24 03:26:31.276490923 +0000 UTC m=+1915.293561736" watchObservedRunningTime="2026-02-24 03:26:31.282036137 +0000 UTC m=+1915.299106950" Feb 24 03:26:31 crc kubenswrapper[4923]: I0224 03:26:31.721935 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9124ee53-9cb0-4817-b967-d22e84935a4d" path="/var/lib/kubelet/pods/9124ee53-9cb0-4817-b967-d22e84935a4d/volumes" Feb 24 03:26:39 crc kubenswrapper[4923]: I0224 03:26:39.713740 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:26:39 crc kubenswrapper[4923]: E0224 03:26:39.715009 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:26:43 crc kubenswrapper[4923]: I0224 03:26:43.824879 4923 scope.go:117] "RemoveContainer" containerID="710b2ca33276dd57b06f5554cab60da196cbafa53c2fcab67057a10b0d5b6ec4" Feb 24 03:26:43 crc kubenswrapper[4923]: I0224 03:26:43.887226 4923 scope.go:117] "RemoveContainer" containerID="b8441050e113a7b10a77459432a3766d7265f920f3e08cd3238b1acfc4589fd7" Feb 24 03:26:43 crc kubenswrapper[4923]: I0224 03:26:43.945252 4923 scope.go:117] "RemoveContainer" containerID="7c9d389928b4202b4e7b7c4b3abaa2bbde602b2a213d7606a60e186324f4751f" Feb 24 03:26:50 crc kubenswrapper[4923]: I0224 03:26:50.713094 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:26:50 crc kubenswrapper[4923]: E0224 03:26:50.713902 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:27:01 crc kubenswrapper[4923]: I0224 03:27:01.713383 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:27:01 crc kubenswrapper[4923]: E0224 03:27:01.714228 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:27:10 crc kubenswrapper[4923]: I0224 03:27:10.605592 4923 generic.go:334] "Generic (PLEG): container finished" podID="f8dab472-e2b2-4eab-8ced-7eed7b1bc842" containerID="237d211e1a2f0f742f868651c62e575767dfc281c35493a142eb050507f371be" exitCode=0 Feb 24 03:27:10 crc kubenswrapper[4923]: I0224 03:27:10.605695 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" event={"ID":"f8dab472-e2b2-4eab-8ced-7eed7b1bc842","Type":"ContainerDied","Data":"237d211e1a2f0f742f868651c62e575767dfc281c35493a142eb050507f371be"} Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.002123 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.091697 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ovn-combined-ca-bundle\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.091816 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzm76\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-kube-api-access-fzm76\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.091888 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-ovn-default-certs-0\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.091944 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.091979 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.092016 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-neutron-metadata-combined-ca-bundle\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.092055 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ssh-key-openstack-edpm-ipam\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.092148 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-repo-setup-combined-ca-bundle\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.092192 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-bootstrap-combined-ca-bundle\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.092254 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-libvirt-combined-ca-bundle\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 
24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.092343 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-telemetry-combined-ca-bundle\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.092458 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-nova-combined-ca-bundle\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.092503 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-inventory\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.092564 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\" (UID: \"f8dab472-e2b2-4eab-8ced-7eed7b1bc842\") " Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.100432 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.100625 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-kube-api-access-fzm76" (OuterVolumeSpecName: "kube-api-access-fzm76") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "kube-api-access-fzm76". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.102422 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.103422 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.103719 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.103761 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.103803 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.103857 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.104001 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.104119 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.104693 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.115760 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.131460 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.135529 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-inventory" (OuterVolumeSpecName: "inventory") pod "f8dab472-e2b2-4eab-8ced-7eed7b1bc842" (UID: "f8dab472-e2b2-4eab-8ced-7eed7b1bc842"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198542 4923 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198595 4923 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198609 4923 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198625 4923 
reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198639 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198653 4923 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198674 4923 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198693 4923 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198705 4923 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198717 4923 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 
crc kubenswrapper[4923]: I0224 03:27:12.198731 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198743 4923 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198756 4923 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.198768 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzm76\" (UniqueName: \"kubernetes.io/projected/f8dab472-e2b2-4eab-8ced-7eed7b1bc842-kube-api-access-fzm76\") on node \"crc\" DevicePath \"\"" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.625662 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" event={"ID":"f8dab472-e2b2-4eab-8ced-7eed7b1bc842","Type":"ContainerDied","Data":"6ba3ad23ff5c96906e6acdac7394decd2ff2ef6b205b3485cd64e39e81663e48"} Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.625718 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba3ad23ff5c96906e6acdac7394decd2ff2ef6b205b3485cd64e39e81663e48" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.625764 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.744954 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52"] Feb 24 03:27:12 crc kubenswrapper[4923]: E0224 03:27:12.745589 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dab472-e2b2-4eab-8ced-7eed7b1bc842" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.745616 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dab472-e2b2-4eab-8ced-7eed7b1bc842" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.745849 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8dab472-e2b2-4eab-8ced-7eed7b1bc842" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.746552 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.750010 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.750208 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.750216 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.750511 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.750530 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.759875 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52"] Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.912328 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.912377 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: 
\"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.912499 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/543b3843-407e-4043-a851-4170590b5a68-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.912555 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdxp\" (UniqueName: \"kubernetes.io/projected/543b3843-407e-4043-a851-4170590b5a68-kube-api-access-qrdxp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:12 crc kubenswrapper[4923]: I0224 03:27:12.912579 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.014446 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.014502 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.014645 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/543b3843-407e-4043-a851-4170590b5a68-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.014725 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdxp\" (UniqueName: \"kubernetes.io/projected/543b3843-407e-4043-a851-4170590b5a68-kube-api-access-qrdxp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.014757 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.015696 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/543b3843-407e-4043-a851-4170590b5a68-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: 
\"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.019498 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.019498 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.027283 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.031695 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdxp\" (UniqueName: \"kubernetes.io/projected/543b3843-407e-4043-a851-4170590b5a68-kube-api-access-qrdxp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hsl52\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.110391 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:27:13 crc kubenswrapper[4923]: I0224 03:27:13.636729 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52"] Feb 24 03:27:14 crc kubenswrapper[4923]: I0224 03:27:14.659188 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" event={"ID":"543b3843-407e-4043-a851-4170590b5a68","Type":"ContainerStarted","Data":"573345c0a098700ae30230ab42e62e02559e6da7029299cd1b5f8f27a66169d6"} Feb 24 03:27:14 crc kubenswrapper[4923]: I0224 03:27:14.659516 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" event={"ID":"543b3843-407e-4043-a851-4170590b5a68","Type":"ContainerStarted","Data":"df63bbc3c83c763b023b7dc5677680bb6261433f1b474c17d7af1e268800635d"} Feb 24 03:27:14 crc kubenswrapper[4923]: I0224 03:27:14.683359 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" podStartSLOduration=2.231809029 podStartE2EDuration="2.683337495s" podCreationTimestamp="2026-02-24 03:27:12 +0000 UTC" firstStartedPulling="2026-02-24 03:27:13.664610486 +0000 UTC m=+1957.681681299" lastFinishedPulling="2026-02-24 03:27:14.116138952 +0000 UTC m=+1958.133209765" observedRunningTime="2026-02-24 03:27:14.678074198 +0000 UTC m=+1958.695145021" watchObservedRunningTime="2026-02-24 03:27:14.683337495 +0000 UTC m=+1958.700408358" Feb 24 03:27:15 crc kubenswrapper[4923]: I0224 03:27:15.713929 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:27:15 crc kubenswrapper[4923]: E0224 03:27:15.714391 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:27:28 crc kubenswrapper[4923]: I0224 03:27:28.713289 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:27:28 crc kubenswrapper[4923]: E0224 03:27:28.714147 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.220620 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gvncl"] Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.223035 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.233276 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gvncl"] Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.327070 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-utilities\") pod \"redhat-operators-gvncl\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.327152 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjd9z\" (UniqueName: \"kubernetes.io/projected/18639db6-7088-4b65-8001-731a94420b8b-kube-api-access-vjd9z\") pod \"redhat-operators-gvncl\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.328071 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-catalog-content\") pod \"redhat-operators-gvncl\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.429825 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-catalog-content\") pod \"redhat-operators-gvncl\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.429898 4923 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-utilities\") pod \"redhat-operators-gvncl\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.429934 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjd9z\" (UniqueName: \"kubernetes.io/projected/18639db6-7088-4b65-8001-731a94420b8b-kube-api-access-vjd9z\") pod \"redhat-operators-gvncl\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.430463 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-utilities\") pod \"redhat-operators-gvncl\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.430611 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-catalog-content\") pod \"redhat-operators-gvncl\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.450561 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjd9z\" (UniqueName: \"kubernetes.io/projected/18639db6-7088-4b65-8001-731a94420b8b-kube-api-access-vjd9z\") pod \"redhat-operators-gvncl\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:40 crc kubenswrapper[4923]: I0224 03:27:40.548919 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:41 crc kubenswrapper[4923]: I0224 03:27:41.009863 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gvncl"] Feb 24 03:27:41 crc kubenswrapper[4923]: I0224 03:27:41.919372 4923 generic.go:334] "Generic (PLEG): container finished" podID="18639db6-7088-4b65-8001-731a94420b8b" containerID="1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9" exitCode=0 Feb 24 03:27:41 crc kubenswrapper[4923]: I0224 03:27:41.920067 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvncl" event={"ID":"18639db6-7088-4b65-8001-731a94420b8b","Type":"ContainerDied","Data":"1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9"} Feb 24 03:27:41 crc kubenswrapper[4923]: I0224 03:27:41.920118 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvncl" event={"ID":"18639db6-7088-4b65-8001-731a94420b8b","Type":"ContainerStarted","Data":"685b8dcca6e8b6d34c889cb9af748b20010529415414619768bd540941a5b0bd"} Feb 24 03:27:42 crc kubenswrapper[4923]: I0224 03:27:42.929209 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvncl" event={"ID":"18639db6-7088-4b65-8001-731a94420b8b","Type":"ContainerStarted","Data":"9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1"} Feb 24 03:27:43 crc kubenswrapper[4923]: I0224 03:27:43.713575 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:27:43 crc kubenswrapper[4923]: E0224 03:27:43.714285 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:27:45 crc kubenswrapper[4923]: I0224 03:27:45.958531 4923 generic.go:334] "Generic (PLEG): container finished" podID="18639db6-7088-4b65-8001-731a94420b8b" containerID="9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1" exitCode=0 Feb 24 03:27:45 crc kubenswrapper[4923]: I0224 03:27:45.958643 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvncl" event={"ID":"18639db6-7088-4b65-8001-731a94420b8b","Type":"ContainerDied","Data":"9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1"} Feb 24 03:27:46 crc kubenswrapper[4923]: I0224 03:27:46.972565 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvncl" event={"ID":"18639db6-7088-4b65-8001-731a94420b8b","Type":"ContainerStarted","Data":"5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b"} Feb 24 03:27:47 crc kubenswrapper[4923]: I0224 03:27:47.001873 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gvncl" podStartSLOduration=2.5542798920000003 podStartE2EDuration="7.001844269s" podCreationTimestamp="2026-02-24 03:27:40 +0000 UTC" firstStartedPulling="2026-02-24 03:27:41.923466798 +0000 UTC m=+1985.940537651" lastFinishedPulling="2026-02-24 03:27:46.371031175 +0000 UTC m=+1990.388102028" observedRunningTime="2026-02-24 03:27:46.997025134 +0000 UTC m=+1991.014095997" watchObservedRunningTime="2026-02-24 03:27:47.001844269 +0000 UTC m=+1991.018915152" Feb 24 03:27:50 crc kubenswrapper[4923]: I0224 03:27:50.549571 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:50 crc kubenswrapper[4923]: I0224 
03:27:50.550282 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:27:51 crc kubenswrapper[4923]: I0224 03:27:51.613531 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gvncl" podUID="18639db6-7088-4b65-8001-731a94420b8b" containerName="registry-server" probeResult="failure" output=< Feb 24 03:27:51 crc kubenswrapper[4923]: timeout: failed to connect service ":50051" within 1s Feb 24 03:27:51 crc kubenswrapper[4923]: > Feb 24 03:27:58 crc kubenswrapper[4923]: I0224 03:27:58.713062 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:27:58 crc kubenswrapper[4923]: E0224 03:27:58.714122 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:28:00 crc kubenswrapper[4923]: I0224 03:28:00.590368 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:28:00 crc kubenswrapper[4923]: I0224 03:28:00.645644 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:28:00 crc kubenswrapper[4923]: I0224 03:28:00.830549 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gvncl"] Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.122008 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gvncl" 
podUID="18639db6-7088-4b65-8001-731a94420b8b" containerName="registry-server" containerID="cri-o://5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b" gracePeriod=2 Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.562885 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.758021 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-catalog-content\") pod \"18639db6-7088-4b65-8001-731a94420b8b\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.758262 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-utilities\") pod \"18639db6-7088-4b65-8001-731a94420b8b\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.758444 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjd9z\" (UniqueName: \"kubernetes.io/projected/18639db6-7088-4b65-8001-731a94420b8b-kube-api-access-vjd9z\") pod \"18639db6-7088-4b65-8001-731a94420b8b\" (UID: \"18639db6-7088-4b65-8001-731a94420b8b\") " Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.759361 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-utilities" (OuterVolumeSpecName: "utilities") pod "18639db6-7088-4b65-8001-731a94420b8b" (UID: "18639db6-7088-4b65-8001-731a94420b8b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.766581 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18639db6-7088-4b65-8001-731a94420b8b-kube-api-access-vjd9z" (OuterVolumeSpecName: "kube-api-access-vjd9z") pod "18639db6-7088-4b65-8001-731a94420b8b" (UID: "18639db6-7088-4b65-8001-731a94420b8b"). InnerVolumeSpecName "kube-api-access-vjd9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.860917 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.861183 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjd9z\" (UniqueName: \"kubernetes.io/projected/18639db6-7088-4b65-8001-731a94420b8b-kube-api-access-vjd9z\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.885192 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18639db6-7088-4b65-8001-731a94420b8b" (UID: "18639db6-7088-4b65-8001-731a94420b8b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:28:02 crc kubenswrapper[4923]: I0224 03:28:02.979703 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18639db6-7088-4b65-8001-731a94420b8b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.135416 4923 generic.go:334] "Generic (PLEG): container finished" podID="18639db6-7088-4b65-8001-731a94420b8b" containerID="5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b" exitCode=0 Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.135487 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvncl" event={"ID":"18639db6-7088-4b65-8001-731a94420b8b","Type":"ContainerDied","Data":"5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b"} Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.135528 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gvncl" event={"ID":"18639db6-7088-4b65-8001-731a94420b8b","Type":"ContainerDied","Data":"685b8dcca6e8b6d34c889cb9af748b20010529415414619768bd540941a5b0bd"} Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.135556 4923 scope.go:117] "RemoveContainer" containerID="5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.136255 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gvncl" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.172096 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gvncl"] Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.172388 4923 scope.go:117] "RemoveContainer" containerID="9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.179716 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gvncl"] Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.196158 4923 scope.go:117] "RemoveContainer" containerID="1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.235213 4923 scope.go:117] "RemoveContainer" containerID="5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b" Feb 24 03:28:03 crc kubenswrapper[4923]: E0224 03:28:03.235720 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b\": container with ID starting with 5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b not found: ID does not exist" containerID="5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.235758 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b"} err="failed to get container status \"5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b\": rpc error: code = NotFound desc = could not find container \"5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b\": container with ID starting with 5cc358164b000839a3118d731ea0a4b6264056546091bff0a8f1dc3636acb28b not found: ID does 
not exist" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.235785 4923 scope.go:117] "RemoveContainer" containerID="9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1" Feb 24 03:28:03 crc kubenswrapper[4923]: E0224 03:28:03.236128 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1\": container with ID starting with 9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1 not found: ID does not exist" containerID="9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.236162 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1"} err="failed to get container status \"9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1\": rpc error: code = NotFound desc = could not find container \"9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1\": container with ID starting with 9f8339541865717bf38125f3e9567f65bb977c1ea3a795877d635b1d7caa9de1 not found: ID does not exist" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.236184 4923 scope.go:117] "RemoveContainer" containerID="1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9" Feb 24 03:28:03 crc kubenswrapper[4923]: E0224 03:28:03.236484 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9\": container with ID starting with 1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9 not found: ID does not exist" containerID="1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.236508 4923 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9"} err="failed to get container status \"1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9\": rpc error: code = NotFound desc = could not find container \"1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9\": container with ID starting with 1ed07adba97d0e1a46e02be307d04ad59a5bfe3fc0e53528b79b51ca3d33f4c9 not found: ID does not exist" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.722259 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18639db6-7088-4b65-8001-731a94420b8b" path="/var/lib/kubelet/pods/18639db6-7088-4b65-8001-731a94420b8b/volumes" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.840453 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8sj6v"] Feb 24 03:28:03 crc kubenswrapper[4923]: E0224 03:28:03.840958 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18639db6-7088-4b65-8001-731a94420b8b" containerName="registry-server" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.840984 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="18639db6-7088-4b65-8001-731a94420b8b" containerName="registry-server" Feb 24 03:28:03 crc kubenswrapper[4923]: E0224 03:28:03.841022 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18639db6-7088-4b65-8001-731a94420b8b" containerName="extract-utilities" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.841033 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="18639db6-7088-4b65-8001-731a94420b8b" containerName="extract-utilities" Feb 24 03:28:03 crc kubenswrapper[4923]: E0224 03:28:03.841057 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18639db6-7088-4b65-8001-731a94420b8b" containerName="extract-content" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.841066 4923 
state_mem.go:107] "Deleted CPUSet assignment" podUID="18639db6-7088-4b65-8001-731a94420b8b" containerName="extract-content" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.841310 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="18639db6-7088-4b65-8001-731a94420b8b" containerName="registry-server" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.843250 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.855573 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sj6v"] Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.896504 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-catalog-content\") pod \"redhat-marketplace-8sj6v\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.896723 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thbr2\" (UniqueName: \"kubernetes.io/projected/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-kube-api-access-thbr2\") pod \"redhat-marketplace-8sj6v\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.896834 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-utilities\") pod \"redhat-marketplace-8sj6v\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.998774 
4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-catalog-content\") pod \"redhat-marketplace-8sj6v\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.998931 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thbr2\" (UniqueName: \"kubernetes.io/projected/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-kube-api-access-thbr2\") pod \"redhat-marketplace-8sj6v\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.999002 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-utilities\") pod \"redhat-marketplace-8sj6v\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.999694 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-utilities\") pod \"redhat-marketplace-8sj6v\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:03 crc kubenswrapper[4923]: I0224 03:28:03.999709 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-catalog-content\") pod \"redhat-marketplace-8sj6v\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:04 crc kubenswrapper[4923]: I0224 03:28:04.018105 4923 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-thbr2\" (UniqueName: \"kubernetes.io/projected/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-kube-api-access-thbr2\") pod \"redhat-marketplace-8sj6v\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:04 crc kubenswrapper[4923]: I0224 03:28:04.172187 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:04 crc kubenswrapper[4923]: I0224 03:28:04.448609 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sj6v"] Feb 24 03:28:05 crc kubenswrapper[4923]: I0224 03:28:05.153494 4923 generic.go:334] "Generic (PLEG): container finished" podID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerID="9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869" exitCode=0 Feb 24 03:28:05 crc kubenswrapper[4923]: I0224 03:28:05.153513 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sj6v" event={"ID":"4a5ede98-a21b-4063-80f3-72a3b3c0c40c","Type":"ContainerDied","Data":"9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869"} Feb 24 03:28:05 crc kubenswrapper[4923]: I0224 03:28:05.153562 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sj6v" event={"ID":"4a5ede98-a21b-4063-80f3-72a3b3c0c40c","Type":"ContainerStarted","Data":"0f6b648218a635482e3cd53cab5b2e1a558952c0b06df3426d7e29a299b97203"} Feb 24 03:28:05 crc kubenswrapper[4923]: I0224 03:28:05.155332 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 03:28:06 crc kubenswrapper[4923]: I0224 03:28:06.163150 4923 generic.go:334] "Generic (PLEG): container finished" podID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerID="cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410" exitCode=0 Feb 24 03:28:06 crc 
kubenswrapper[4923]: I0224 03:28:06.163222 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sj6v" event={"ID":"4a5ede98-a21b-4063-80f3-72a3b3c0c40c","Type":"ContainerDied","Data":"cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410"} Feb 24 03:28:07 crc kubenswrapper[4923]: I0224 03:28:07.173894 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sj6v" event={"ID":"4a5ede98-a21b-4063-80f3-72a3b3c0c40c","Type":"ContainerStarted","Data":"929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d"} Feb 24 03:28:07 crc kubenswrapper[4923]: I0224 03:28:07.202225 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8sj6v" podStartSLOduration=2.766395348 podStartE2EDuration="4.202204265s" podCreationTimestamp="2026-02-24 03:28:03 +0000 UTC" firstStartedPulling="2026-02-24 03:28:05.155088162 +0000 UTC m=+2009.172158975" lastFinishedPulling="2026-02-24 03:28:06.590897079 +0000 UTC m=+2010.607967892" observedRunningTime="2026-02-24 03:28:07.196559927 +0000 UTC m=+2011.213630740" watchObservedRunningTime="2026-02-24 03:28:07.202204265 +0000 UTC m=+2011.219275078" Feb 24 03:28:13 crc kubenswrapper[4923]: I0224 03:28:13.713491 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:28:13 crc kubenswrapper[4923]: E0224 03:28:13.714621 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:28:14 crc kubenswrapper[4923]: I0224 03:28:14.172533 4923 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:14 crc kubenswrapper[4923]: I0224 03:28:14.172986 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:14 crc kubenswrapper[4923]: I0224 03:28:14.245818 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:14 crc kubenswrapper[4923]: I0224 03:28:14.330152 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:14 crc kubenswrapper[4923]: I0224 03:28:14.492547 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sj6v"] Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.268768 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8sj6v" podUID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerName="registry-server" containerID="cri-o://929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d" gracePeriod=2 Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.827099 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.894966 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-utilities\") pod \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.895031 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-catalog-content\") pod \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.895117 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thbr2\" (UniqueName: \"kubernetes.io/projected/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-kube-api-access-thbr2\") pod \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\" (UID: \"4a5ede98-a21b-4063-80f3-72a3b3c0c40c\") " Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.896099 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-utilities" (OuterVolumeSpecName: "utilities") pod "4a5ede98-a21b-4063-80f3-72a3b3c0c40c" (UID: "4a5ede98-a21b-4063-80f3-72a3b3c0c40c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.903213 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-kube-api-access-thbr2" (OuterVolumeSpecName: "kube-api-access-thbr2") pod "4a5ede98-a21b-4063-80f3-72a3b3c0c40c" (UID: "4a5ede98-a21b-4063-80f3-72a3b3c0c40c"). InnerVolumeSpecName "kube-api-access-thbr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.928498 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a5ede98-a21b-4063-80f3-72a3b3c0c40c" (UID: "4a5ede98-a21b-4063-80f3-72a3b3c0c40c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.997520 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.997563 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:16 crc kubenswrapper[4923]: I0224 03:28:16.997575 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thbr2\" (UniqueName: \"kubernetes.io/projected/4a5ede98-a21b-4063-80f3-72a3b3c0c40c-kube-api-access-thbr2\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.281470 4923 generic.go:334] "Generic (PLEG): container finished" podID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerID="929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d" exitCode=0 Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.281507 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8sj6v" event={"ID":"4a5ede98-a21b-4063-80f3-72a3b3c0c40c","Type":"ContainerDied","Data":"929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d"} Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.281547 4923 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8sj6v" event={"ID":"4a5ede98-a21b-4063-80f3-72a3b3c0c40c","Type":"ContainerDied","Data":"0f6b648218a635482e3cd53cab5b2e1a558952c0b06df3426d7e29a299b97203"} Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.281559 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8sj6v" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.281566 4923 scope.go:117] "RemoveContainer" containerID="929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.315994 4923 scope.go:117] "RemoveContainer" containerID="cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.327464 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sj6v"] Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.339795 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8sj6v"] Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.353524 4923 scope.go:117] "RemoveContainer" containerID="9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.419856 4923 scope.go:117] "RemoveContainer" containerID="929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d" Feb 24 03:28:17 crc kubenswrapper[4923]: E0224 03:28:17.420601 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d\": container with ID starting with 929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d not found: ID does not exist" containerID="929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.420681 4923 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d"} err="failed to get container status \"929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d\": rpc error: code = NotFound desc = could not find container \"929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d\": container with ID starting with 929e6199a785223ff382886c6080de1e64ecd66fbf2a6f8ec222df1fbd473e8d not found: ID does not exist" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.420720 4923 scope.go:117] "RemoveContainer" containerID="cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410" Feb 24 03:28:17 crc kubenswrapper[4923]: E0224 03:28:17.421494 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410\": container with ID starting with cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410 not found: ID does not exist" containerID="cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.421541 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410"} err="failed to get container status \"cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410\": rpc error: code = NotFound desc = could not find container \"cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410\": container with ID starting with cc41baec78a28d07ed37f0a5bf50adc52fb69124f714c17b1e2b8c44d69dd410 not found: ID does not exist" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.421566 4923 scope.go:117] "RemoveContainer" containerID="9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869" Feb 24 03:28:17 crc kubenswrapper[4923]: E0224 
03:28:17.421891 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869\": container with ID starting with 9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869 not found: ID does not exist" containerID="9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.421933 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869"} err="failed to get container status \"9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869\": rpc error: code = NotFound desc = could not find container \"9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869\": container with ID starting with 9747e55692b154c64dbe1eb91b8d9ceb2dcdf854594a6321dd43fd0751549869 not found: ID does not exist" Feb 24 03:28:17 crc kubenswrapper[4923]: I0224 03:28:17.742410 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" path="/var/lib/kubelet/pods/4a5ede98-a21b-4063-80f3-72a3b3c0c40c/volumes" Feb 24 03:28:19 crc kubenswrapper[4923]: I0224 03:28:19.306929 4923 generic.go:334] "Generic (PLEG): container finished" podID="543b3843-407e-4043-a851-4170590b5a68" containerID="573345c0a098700ae30230ab42e62e02559e6da7029299cd1b5f8f27a66169d6" exitCode=0 Feb 24 03:28:19 crc kubenswrapper[4923]: I0224 03:28:19.306996 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" event={"ID":"543b3843-407e-4043-a851-4170590b5a68","Type":"ContainerDied","Data":"573345c0a098700ae30230ab42e62e02559e6da7029299cd1b5f8f27a66169d6"} Feb 24 03:28:20 crc kubenswrapper[4923]: I0224 03:28:20.871551 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:28:20 crc kubenswrapper[4923]: I0224 03:28:20.989936 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/543b3843-407e-4043-a851-4170590b5a68-ovncontroller-config-0\") pod \"543b3843-407e-4043-a851-4170590b5a68\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " Feb 24 03:28:20 crc kubenswrapper[4923]: I0224 03:28:20.990202 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ovn-combined-ca-bundle\") pod \"543b3843-407e-4043-a851-4170590b5a68\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " Feb 24 03:28:20 crc kubenswrapper[4923]: I0224 03:28:20.990359 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ssh-key-openstack-edpm-ipam\") pod \"543b3843-407e-4043-a851-4170590b5a68\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " Feb 24 03:28:20 crc kubenswrapper[4923]: I0224 03:28:20.990448 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-inventory\") pod \"543b3843-407e-4043-a851-4170590b5a68\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " Feb 24 03:28:20 crc kubenswrapper[4923]: I0224 03:28:20.990610 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrdxp\" (UniqueName: \"kubernetes.io/projected/543b3843-407e-4043-a851-4170590b5a68-kube-api-access-qrdxp\") pod \"543b3843-407e-4043-a851-4170590b5a68\" (UID: \"543b3843-407e-4043-a851-4170590b5a68\") " Feb 24 03:28:20 crc kubenswrapper[4923]: I0224 03:28:20.995568 4923 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543b3843-407e-4043-a851-4170590b5a68-kube-api-access-qrdxp" (OuterVolumeSpecName: "kube-api-access-qrdxp") pod "543b3843-407e-4043-a851-4170590b5a68" (UID: "543b3843-407e-4043-a851-4170590b5a68"). InnerVolumeSpecName "kube-api-access-qrdxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:28:20 crc kubenswrapper[4923]: I0224 03:28:20.995841 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "543b3843-407e-4043-a851-4170590b5a68" (UID: "543b3843-407e-4043-a851-4170590b5a68"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.013320 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/543b3843-407e-4043-a851-4170590b5a68-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "543b3843-407e-4043-a851-4170590b5a68" (UID: "543b3843-407e-4043-a851-4170590b5a68"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.014525 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "543b3843-407e-4043-a851-4170590b5a68" (UID: "543b3843-407e-4043-a851-4170590b5a68"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.016575 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-inventory" (OuterVolumeSpecName: "inventory") pod "543b3843-407e-4043-a851-4170590b5a68" (UID: "543b3843-407e-4043-a851-4170590b5a68"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.093127 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.093157 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.093166 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrdxp\" (UniqueName: \"kubernetes.io/projected/543b3843-407e-4043-a851-4170590b5a68-kube-api-access-qrdxp\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.093174 4923 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/543b3843-407e-4043-a851-4170590b5a68-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.093185 4923 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b3843-407e-4043-a851-4170590b5a68-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.332760 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" event={"ID":"543b3843-407e-4043-a851-4170590b5a68","Type":"ContainerDied","Data":"df63bbc3c83c763b023b7dc5677680bb6261433f1b474c17d7af1e268800635d"} Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.332799 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df63bbc3c83c763b023b7dc5677680bb6261433f1b474c17d7af1e268800635d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.332816 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hsl52" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.427740 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d"] Feb 24 03:28:21 crc kubenswrapper[4923]: E0224 03:28:21.428111 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerName="registry-server" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.428130 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerName="registry-server" Feb 24 03:28:21 crc kubenswrapper[4923]: E0224 03:28:21.428146 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerName="extract-utilities" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.428153 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerName="extract-utilities" Feb 24 03:28:21 crc kubenswrapper[4923]: E0224 03:28:21.428165 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543b3843-407e-4043-a851-4170590b5a68" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.428172 4923 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="543b3843-407e-4043-a851-4170590b5a68" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 24 03:28:21 crc kubenswrapper[4923]: E0224 03:28:21.428194 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerName="extract-content" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.428200 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerName="extract-content" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.428377 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="543b3843-407e-4043-a851-4170590b5a68" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.428406 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5ede98-a21b-4063-80f3-72a3b3c0c40c" containerName="registry-server" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.428985 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.431278 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.431599 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.431816 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.431945 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.432077 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.432196 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.444572 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d"] Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.603589 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.604043 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg5j7\" (UniqueName: \"kubernetes.io/projected/a55af564-f005-452b-acb3-8fa3910b1485-kube-api-access-sg5j7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.604090 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.604144 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.604211 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.604422 4923 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.706764 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.706855 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.707060 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.707256 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.707458 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg5j7\" (UniqueName: \"kubernetes.io/projected/a55af564-f005-452b-acb3-8fa3910b1485-kube-api-access-sg5j7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.707526 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.713851 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.714250 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.715454 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.719868 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.721968 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.731004 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg5j7\" (UniqueName: \"kubernetes.io/projected/a55af564-f005-452b-acb3-8fa3910b1485-kube-api-access-sg5j7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d\" (UID: 
\"a55af564-f005-452b-acb3-8fa3910b1485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:21 crc kubenswrapper[4923]: I0224 03:28:21.752025 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:28:22 crc kubenswrapper[4923]: I0224 03:28:22.318116 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d"] Feb 24 03:28:22 crc kubenswrapper[4923]: I0224 03:28:22.344340 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" event={"ID":"a55af564-f005-452b-acb3-8fa3910b1485","Type":"ContainerStarted","Data":"1c0075b6f2d57e9e842efd1a2fcfa9b86324adb1c2ca933f82baa90f581ecdd3"} Feb 24 03:28:23 crc kubenswrapper[4923]: I0224 03:28:23.352371 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" event={"ID":"a55af564-f005-452b-acb3-8fa3910b1485","Type":"ContainerStarted","Data":"1477d85067bfa23f4e73811ec9527de0902d862a1596509f1d3df6aaff5a9c1a"} Feb 24 03:28:23 crc kubenswrapper[4923]: I0224 03:28:23.373703 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" podStartSLOduration=1.907539374 podStartE2EDuration="2.37368162s" podCreationTimestamp="2026-02-24 03:28:21 +0000 UTC" firstStartedPulling="2026-02-24 03:28:22.323426078 +0000 UTC m=+2026.340496891" lastFinishedPulling="2026-02-24 03:28:22.789568324 +0000 UTC m=+2026.806639137" observedRunningTime="2026-02-24 03:28:23.365967839 +0000 UTC m=+2027.383038652" watchObservedRunningTime="2026-02-24 03:28:23.37368162 +0000 UTC m=+2027.390752443" Feb 24 03:28:28 crc kubenswrapper[4923]: I0224 03:28:28.713860 4923 scope.go:117] "RemoveContainer" 
containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:28:28 crc kubenswrapper[4923]: E0224 03:28:28.714514 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:28:42 crc kubenswrapper[4923]: I0224 03:28:42.713728 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:28:42 crc kubenswrapper[4923]: E0224 03:28:42.715102 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.036979 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bl6bp"] Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.040837 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.057184 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bl6bp"] Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.155843 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-utilities\") pod \"community-operators-bl6bp\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.156170 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-catalog-content\") pod \"community-operators-bl6bp\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.156329 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg55d\" (UniqueName: \"kubernetes.io/projected/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-kube-api-access-bg55d\") pod \"community-operators-bl6bp\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.258008 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-utilities\") pod \"community-operators-bl6bp\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.258073 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-catalog-content\") pod \"community-operators-bl6bp\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.258102 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg55d\" (UniqueName: \"kubernetes.io/projected/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-kube-api-access-bg55d\") pod \"community-operators-bl6bp\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.258732 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-utilities\") pod \"community-operators-bl6bp\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.258744 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-catalog-content\") pod \"community-operators-bl6bp\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.281476 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg55d\" (UniqueName: \"kubernetes.io/projected/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-kube-api-access-bg55d\") pod \"community-operators-bl6bp\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.367853 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:28:55 crc kubenswrapper[4923]: I0224 03:28:55.872091 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bl6bp"] Feb 24 03:28:56 crc kubenswrapper[4923]: E0224 03:28:56.205538 4923 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae0eeff2_3ae3_49da_97b3_55815d2a92c4.slice/crio-conmon-65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74.scope\": RecentStats: unable to find data in memory cache]" Feb 24 03:28:56 crc kubenswrapper[4923]: I0224 03:28:56.683711 4923 generic.go:334] "Generic (PLEG): container finished" podID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerID="65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74" exitCode=0 Feb 24 03:28:56 crc kubenswrapper[4923]: I0224 03:28:56.683798 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl6bp" event={"ID":"ae0eeff2-3ae3-49da-97b3-55815d2a92c4","Type":"ContainerDied","Data":"65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74"} Feb 24 03:28:56 crc kubenswrapper[4923]: I0224 03:28:56.684072 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl6bp" event={"ID":"ae0eeff2-3ae3-49da-97b3-55815d2a92c4","Type":"ContainerStarted","Data":"256aa79024fbe76c864d015056ab4c63826a98dfa12b0b1a1eb9e2d5bd0d7773"} Feb 24 03:28:57 crc kubenswrapper[4923]: I0224 03:28:57.693926 4923 generic.go:334] "Generic (PLEG): container finished" podID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerID="51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a" exitCode=0 Feb 24 03:28:57 crc kubenswrapper[4923]: I0224 03:28:57.694222 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bl6bp" event={"ID":"ae0eeff2-3ae3-49da-97b3-55815d2a92c4","Type":"ContainerDied","Data":"51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a"} Feb 24 03:28:57 crc kubenswrapper[4923]: I0224 03:28:57.719828 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:28:57 crc kubenswrapper[4923]: E0224 03:28:57.720411 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:28:58 crc kubenswrapper[4923]: I0224 03:28:58.704704 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl6bp" event={"ID":"ae0eeff2-3ae3-49da-97b3-55815d2a92c4","Type":"ContainerStarted","Data":"10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66"} Feb 24 03:28:58 crc kubenswrapper[4923]: I0224 03:28:58.723855 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bl6bp" podStartSLOduration=2.322796265 podStartE2EDuration="3.723840314s" podCreationTimestamp="2026-02-24 03:28:55 +0000 UTC" firstStartedPulling="2026-02-24 03:28:56.685707727 +0000 UTC m=+2060.702778540" lastFinishedPulling="2026-02-24 03:28:58.086751776 +0000 UTC m=+2062.103822589" observedRunningTime="2026-02-24 03:28:58.719179233 +0000 UTC m=+2062.736250046" watchObservedRunningTime="2026-02-24 03:28:58.723840314 +0000 UTC m=+2062.740911127" Feb 24 03:29:05 crc kubenswrapper[4923]: I0224 03:29:05.368117 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:29:05 crc kubenswrapper[4923]: I0224 03:29:05.368894 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:29:05 crc kubenswrapper[4923]: I0224 03:29:05.431591 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:29:05 crc kubenswrapper[4923]: I0224 03:29:05.839196 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:29:05 crc kubenswrapper[4923]: I0224 03:29:05.929045 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bl6bp"] Feb 24 03:29:07 crc kubenswrapper[4923]: I0224 03:29:07.795572 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bl6bp" podUID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerName="registry-server" containerID="cri-o://10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66" gracePeriod=2 Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.092144 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pm6k2"] Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.094000 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.116203 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pm6k2"] Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.237737 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-utilities\") pod \"certified-operators-pm6k2\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.238089 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-catalog-content\") pod \"certified-operators-pm6k2\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.238160 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftxx\" (UniqueName: \"kubernetes.io/projected/c19d7570-8aa3-4699-8824-0f800d93504c-kube-api-access-gftxx\") pod \"certified-operators-pm6k2\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.269264 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.339582 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-utilities\") pod \"certified-operators-pm6k2\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.339662 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-catalog-content\") pod \"certified-operators-pm6k2\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.339739 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gftxx\" (UniqueName: \"kubernetes.io/projected/c19d7570-8aa3-4699-8824-0f800d93504c-kube-api-access-gftxx\") pod \"certified-operators-pm6k2\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.340243 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-utilities\") pod \"certified-operators-pm6k2\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.340450 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-catalog-content\") pod \"certified-operators-pm6k2\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " 
pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.362002 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gftxx\" (UniqueName: \"kubernetes.io/projected/c19d7570-8aa3-4699-8824-0f800d93504c-kube-api-access-gftxx\") pod \"certified-operators-pm6k2\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.422053 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.440948 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-utilities\") pod \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.441029 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-catalog-content\") pod \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.441286 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg55d\" (UniqueName: \"kubernetes.io/projected/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-kube-api-access-bg55d\") pod \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\" (UID: \"ae0eeff2-3ae3-49da-97b3-55815d2a92c4\") " Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.442162 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-utilities" (OuterVolumeSpecName: "utilities") pod 
"ae0eeff2-3ae3-49da-97b3-55815d2a92c4" (UID: "ae0eeff2-3ae3-49da-97b3-55815d2a92c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.447545 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-kube-api-access-bg55d" (OuterVolumeSpecName: "kube-api-access-bg55d") pod "ae0eeff2-3ae3-49da-97b3-55815d2a92c4" (UID: "ae0eeff2-3ae3-49da-97b3-55815d2a92c4"). InnerVolumeSpecName "kube-api-access-bg55d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.543000 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.543036 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg55d\" (UniqueName: \"kubernetes.io/projected/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-kube-api-access-bg55d\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.775388 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae0eeff2-3ae3-49da-97b3-55815d2a92c4" (UID: "ae0eeff2-3ae3-49da-97b3-55815d2a92c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.807911 4923 generic.go:334] "Generic (PLEG): container finished" podID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerID="10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66" exitCode=0 Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.807954 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl6bp" event={"ID":"ae0eeff2-3ae3-49da-97b3-55815d2a92c4","Type":"ContainerDied","Data":"10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66"} Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.807998 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bl6bp" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.808021 4923 scope.go:117] "RemoveContainer" containerID="10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.808007 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bl6bp" event={"ID":"ae0eeff2-3ae3-49da-97b3-55815d2a92c4","Type":"ContainerDied","Data":"256aa79024fbe76c864d015056ab4c63826a98dfa12b0b1a1eb9e2d5bd0d7773"} Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.847941 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0eeff2-3ae3-49da-97b3-55815d2a92c4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.857442 4923 scope.go:117] "RemoveContainer" containerID="51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.858834 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bl6bp"] Feb 24 03:29:08 crc kubenswrapper[4923]: 
I0224 03:29:08.870782 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bl6bp"] Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.880077 4923 scope.go:117] "RemoveContainer" containerID="65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.897206 4923 scope.go:117] "RemoveContainer" containerID="10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66" Feb 24 03:29:08 crc kubenswrapper[4923]: E0224 03:29:08.897516 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66\": container with ID starting with 10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66 not found: ID does not exist" containerID="10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.897556 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66"} err="failed to get container status \"10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66\": rpc error: code = NotFound desc = could not find container \"10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66\": container with ID starting with 10b517b7f3188ed8eb019bbc8c2b2568ff9a3862be04924c09890a4611893c66 not found: ID does not exist" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.897585 4923 scope.go:117] "RemoveContainer" containerID="51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a" Feb 24 03:29:08 crc kubenswrapper[4923]: E0224 03:29:08.897898 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a\": container 
with ID starting with 51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a not found: ID does not exist" containerID="51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.897929 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a"} err="failed to get container status \"51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a\": rpc error: code = NotFound desc = could not find container \"51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a\": container with ID starting with 51f120bb496be7e7edfa0b8983d299ff77f977625a2de1e83abf3ca60c5df70a not found: ID does not exist" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.897948 4923 scope.go:117] "RemoveContainer" containerID="65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74" Feb 24 03:29:08 crc kubenswrapper[4923]: E0224 03:29:08.898350 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74\": container with ID starting with 65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74 not found: ID does not exist" containerID="65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.898627 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74"} err="failed to get container status \"65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74\": rpc error: code = NotFound desc = could not find container \"65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74\": container with ID starting with 65712961d3f364e69115efc5c2c2be2364bc741e44ccc636e01031d2004d2a74 not 
found: ID does not exist" Feb 24 03:29:08 crc kubenswrapper[4923]: I0224 03:29:08.947843 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pm6k2"] Feb 24 03:29:09 crc kubenswrapper[4923]: I0224 03:29:09.713874 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:29:09 crc kubenswrapper[4923]: E0224 03:29:09.714436 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:29:09 crc kubenswrapper[4923]: I0224 03:29:09.728291 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" path="/var/lib/kubelet/pods/ae0eeff2-3ae3-49da-97b3-55815d2a92c4/volumes" Feb 24 03:29:09 crc kubenswrapper[4923]: I0224 03:29:09.828426 4923 generic.go:334] "Generic (PLEG): container finished" podID="c19d7570-8aa3-4699-8824-0f800d93504c" containerID="61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb" exitCode=0 Feb 24 03:29:09 crc kubenswrapper[4923]: I0224 03:29:09.828486 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm6k2" event={"ID":"c19d7570-8aa3-4699-8824-0f800d93504c","Type":"ContainerDied","Data":"61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb"} Feb 24 03:29:09 crc kubenswrapper[4923]: I0224 03:29:09.828524 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm6k2" 
event={"ID":"c19d7570-8aa3-4699-8824-0f800d93504c","Type":"ContainerStarted","Data":"2502f1d4b05804958c63188c420500e31de1da35c72c2c7c7c2a1d1c77c2f39f"} Feb 24 03:29:10 crc kubenswrapper[4923]: I0224 03:29:10.839092 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm6k2" event={"ID":"c19d7570-8aa3-4699-8824-0f800d93504c","Type":"ContainerStarted","Data":"e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483"} Feb 24 03:29:11 crc kubenswrapper[4923]: I0224 03:29:11.854912 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm6k2" event={"ID":"c19d7570-8aa3-4699-8824-0f800d93504c","Type":"ContainerDied","Data":"e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483"} Feb 24 03:29:11 crc kubenswrapper[4923]: I0224 03:29:11.855182 4923 generic.go:334] "Generic (PLEG): container finished" podID="c19d7570-8aa3-4699-8824-0f800d93504c" containerID="e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483" exitCode=0 Feb 24 03:29:12 crc kubenswrapper[4923]: I0224 03:29:12.867661 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm6k2" event={"ID":"c19d7570-8aa3-4699-8824-0f800d93504c","Type":"ContainerStarted","Data":"41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369"} Feb 24 03:29:12 crc kubenswrapper[4923]: I0224 03:29:12.899776 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pm6k2" podStartSLOduration=2.427920764 podStartE2EDuration="4.899760123s" podCreationTimestamp="2026-02-24 03:29:08 +0000 UTC" firstStartedPulling="2026-02-24 03:29:09.831618081 +0000 UTC m=+2073.848688934" lastFinishedPulling="2026-02-24 03:29:12.30345748 +0000 UTC m=+2076.320528293" observedRunningTime="2026-02-24 03:29:12.894375993 +0000 UTC m=+2076.911446806" watchObservedRunningTime="2026-02-24 03:29:12.899760123 +0000 UTC 
m=+2076.916830936" Feb 24 03:29:14 crc kubenswrapper[4923]: I0224 03:29:14.890722 4923 generic.go:334] "Generic (PLEG): container finished" podID="a55af564-f005-452b-acb3-8fa3910b1485" containerID="1477d85067bfa23f4e73811ec9527de0902d862a1596509f1d3df6aaff5a9c1a" exitCode=0 Feb 24 03:29:14 crc kubenswrapper[4923]: I0224 03:29:14.890922 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" event={"ID":"a55af564-f005-452b-acb3-8fa3910b1485","Type":"ContainerDied","Data":"1477d85067bfa23f4e73811ec9527de0902d862a1596509f1d3df6aaff5a9c1a"} Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.377255 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.429475 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-metadata-combined-ca-bundle\") pod \"a55af564-f005-452b-acb3-8fa3910b1485\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.429548 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-nova-metadata-neutron-config-0\") pod \"a55af564-f005-452b-acb3-8fa3910b1485\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.429680 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a55af564-f005-452b-acb3-8fa3910b1485\" (UID: 
\"a55af564-f005-452b-acb3-8fa3910b1485\") " Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.429763 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg5j7\" (UniqueName: \"kubernetes.io/projected/a55af564-f005-452b-acb3-8fa3910b1485-kube-api-access-sg5j7\") pod \"a55af564-f005-452b-acb3-8fa3910b1485\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.429805 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-inventory\") pod \"a55af564-f005-452b-acb3-8fa3910b1485\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.429863 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-ssh-key-openstack-edpm-ipam\") pod \"a55af564-f005-452b-acb3-8fa3910b1485\" (UID: \"a55af564-f005-452b-acb3-8fa3910b1485\") " Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.436633 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a55af564-f005-452b-acb3-8fa3910b1485-kube-api-access-sg5j7" (OuterVolumeSpecName: "kube-api-access-sg5j7") pod "a55af564-f005-452b-acb3-8fa3910b1485" (UID: "a55af564-f005-452b-acb3-8fa3910b1485"). InnerVolumeSpecName "kube-api-access-sg5j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.443560 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a55af564-f005-452b-acb3-8fa3910b1485" (UID: "a55af564-f005-452b-acb3-8fa3910b1485"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.458880 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a55af564-f005-452b-acb3-8fa3910b1485" (UID: "a55af564-f005-452b-acb3-8fa3910b1485"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.459495 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a55af564-f005-452b-acb3-8fa3910b1485" (UID: "a55af564-f005-452b-acb3-8fa3910b1485"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.463776 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-inventory" (OuterVolumeSpecName: "inventory") pod "a55af564-f005-452b-acb3-8fa3910b1485" (UID: "a55af564-f005-452b-acb3-8fa3910b1485"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.478399 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a55af564-f005-452b-acb3-8fa3910b1485" (UID: "a55af564-f005-452b-acb3-8fa3910b1485"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.531632 4923 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.531912 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg5j7\" (UniqueName: \"kubernetes.io/projected/a55af564-f005-452b-acb3-8fa3910b1485-kube-api-access-sg5j7\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.531922 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.531931 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.531940 4923 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.531950 4923 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a55af564-f005-452b-acb3-8fa3910b1485-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.912127 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" event={"ID":"a55af564-f005-452b-acb3-8fa3910b1485","Type":"ContainerDied","Data":"1c0075b6f2d57e9e842efd1a2fcfa9b86324adb1c2ca933f82baa90f581ecdd3"} Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.912173 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c0075b6f2d57e9e842efd1a2fcfa9b86324adb1c2ca933f82baa90f581ecdd3" Feb 24 03:29:16 crc kubenswrapper[4923]: I0224 03:29:16.912175 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.040694 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn"] Feb 24 03:29:17 crc kubenswrapper[4923]: E0224 03:29:17.041110 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerName="extract-utilities" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.041132 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerName="extract-utilities" Feb 24 03:29:17 crc kubenswrapper[4923]: E0224 03:29:17.041151 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55af564-f005-452b-acb3-8fa3910b1485" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.041162 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55af564-f005-452b-acb3-8fa3910b1485" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 24 03:29:17 crc kubenswrapper[4923]: E0224 03:29:17.041178 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerName="extract-content" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.041186 4923 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerName="extract-content" Feb 24 03:29:17 crc kubenswrapper[4923]: E0224 03:29:17.041204 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerName="registry-server" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.041211 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerName="registry-server" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.041461 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae0eeff2-3ae3-49da-97b3-55815d2a92c4" containerName="registry-server" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.041484 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="a55af564-f005-452b-acb3-8fa3910b1485" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.042209 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.049729 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.049948 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.050139 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.050430 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.050598 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.053092 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn"] Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.241388 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.241548 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rgt5\" (UniqueName: \"kubernetes.io/projected/aca2992d-fbda-4dad-8ab4-02147a40ed9e-kube-api-access-6rgt5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: 
\"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.241605 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.241883 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.241946 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.343934 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.344011 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rgt5\" (UniqueName: \"kubernetes.io/projected/aca2992d-fbda-4dad-8ab4-02147a40ed9e-kube-api-access-6rgt5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.344048 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.344152 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.344184 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.350396 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: 
\"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.351267 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.352064 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.359333 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.380341 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rgt5\" (UniqueName: \"kubernetes.io/projected/aca2992d-fbda-4dad-8ab4-02147a40ed9e-kube-api-access-6rgt5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:17 crc kubenswrapper[4923]: I0224 03:29:17.660546 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:29:18 crc kubenswrapper[4923]: I0224 03:29:18.029746 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn"] Feb 24 03:29:18 crc kubenswrapper[4923]: I0224 03:29:18.422605 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:18 crc kubenswrapper[4923]: I0224 03:29:18.422671 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:18 crc kubenswrapper[4923]: I0224 03:29:18.500660 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:18 crc kubenswrapper[4923]: I0224 03:29:18.942447 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" event={"ID":"aca2992d-fbda-4dad-8ab4-02147a40ed9e","Type":"ContainerStarted","Data":"6d372d65aa58c8e32b56edc53bf5192282038b75289981e8a5e6f1e900d6252d"} Feb 24 03:29:18 crc kubenswrapper[4923]: I0224 03:29:18.942507 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" event={"ID":"aca2992d-fbda-4dad-8ab4-02147a40ed9e","Type":"ContainerStarted","Data":"cb489f6b82652b3c9e335c014e0950122e6068887374607cdd66e2526087e968"} Feb 24 03:29:18 crc kubenswrapper[4923]: I0224 03:29:18.964351 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" podStartSLOduration=1.576189796 podStartE2EDuration="1.964330207s" podCreationTimestamp="2026-02-24 03:29:17 +0000 UTC" firstStartedPulling="2026-02-24 03:29:18.039838986 +0000 UTC m=+2082.056909799" lastFinishedPulling="2026-02-24 03:29:18.427979387 +0000 UTC m=+2082.445050210" 
observedRunningTime="2026-02-24 03:29:18.961538284 +0000 UTC m=+2082.978609127" watchObservedRunningTime="2026-02-24 03:29:18.964330207 +0000 UTC m=+2082.981401020" Feb 24 03:29:18 crc kubenswrapper[4923]: I0224 03:29:18.997505 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:19 crc kubenswrapper[4923]: I0224 03:29:19.050403 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pm6k2"] Feb 24 03:29:20 crc kubenswrapper[4923]: I0224 03:29:20.713277 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:29:20 crc kubenswrapper[4923]: I0224 03:29:20.964213 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pm6k2" podUID="c19d7570-8aa3-4699-8824-0f800d93504c" containerName="registry-server" containerID="cri-o://41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369" gracePeriod=2 Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.456784 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.627097 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-utilities\") pod \"c19d7570-8aa3-4699-8824-0f800d93504c\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.627207 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gftxx\" (UniqueName: \"kubernetes.io/projected/c19d7570-8aa3-4699-8824-0f800d93504c-kube-api-access-gftxx\") pod \"c19d7570-8aa3-4699-8824-0f800d93504c\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.627262 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-catalog-content\") pod \"c19d7570-8aa3-4699-8824-0f800d93504c\" (UID: \"c19d7570-8aa3-4699-8824-0f800d93504c\") " Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.628852 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-utilities" (OuterVolumeSpecName: "utilities") pod "c19d7570-8aa3-4699-8824-0f800d93504c" (UID: "c19d7570-8aa3-4699-8824-0f800d93504c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.639514 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c19d7570-8aa3-4699-8824-0f800d93504c-kube-api-access-gftxx" (OuterVolumeSpecName: "kube-api-access-gftxx") pod "c19d7570-8aa3-4699-8824-0f800d93504c" (UID: "c19d7570-8aa3-4699-8824-0f800d93504c"). InnerVolumeSpecName "kube-api-access-gftxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.729833 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gftxx\" (UniqueName: \"kubernetes.io/projected/c19d7570-8aa3-4699-8824-0f800d93504c-kube-api-access-gftxx\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.729862 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.977073 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"960084f70366f4433330b126ef123c279f963df8e18f6dfbc42d695fdb9ed015"} Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.982072 4923 generic.go:334] "Generic (PLEG): container finished" podID="c19d7570-8aa3-4699-8824-0f800d93504c" containerID="41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369" exitCode=0 Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.982120 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm6k2" event={"ID":"c19d7570-8aa3-4699-8824-0f800d93504c","Type":"ContainerDied","Data":"41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369"} Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.982159 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm6k2" event={"ID":"c19d7570-8aa3-4699-8824-0f800d93504c","Type":"ContainerDied","Data":"2502f1d4b05804958c63188c420500e31de1da35c72c2c7c7c2a1d1c77c2f39f"} Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.982195 4923 scope.go:117] "RemoveContainer" 
containerID="41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369" Feb 24 03:29:21 crc kubenswrapper[4923]: I0224 03:29:21.982240 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm6k2" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.038142 4923 scope.go:117] "RemoveContainer" containerID="e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.062718 4923 scope.go:117] "RemoveContainer" containerID="61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.131590 4923 scope.go:117] "RemoveContainer" containerID="41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369" Feb 24 03:29:22 crc kubenswrapper[4923]: E0224 03:29:22.132166 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369\": container with ID starting with 41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369 not found: ID does not exist" containerID="41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.132261 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369"} err="failed to get container status \"41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369\": rpc error: code = NotFound desc = could not find container \"41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369\": container with ID starting with 41728346a633961c75d3ed5913f9e412f782198368efc558ebc752e08ff1f369 not found: ID does not exist" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.132343 4923 scope.go:117] "RemoveContainer" 
containerID="e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483" Feb 24 03:29:22 crc kubenswrapper[4923]: E0224 03:29:22.132908 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483\": container with ID starting with e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483 not found: ID does not exist" containerID="e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.132951 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483"} err="failed to get container status \"e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483\": rpc error: code = NotFound desc = could not find container \"e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483\": container with ID starting with e5cd4456aa2e2afc5053f6d2a157193f8fe186fd42596297a63a8e12f3beb483 not found: ID does not exist" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.132980 4923 scope.go:117] "RemoveContainer" containerID="61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb" Feb 24 03:29:22 crc kubenswrapper[4923]: E0224 03:29:22.133285 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb\": container with ID starting with 61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb not found: ID does not exist" containerID="61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.133340 4923 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb"} err="failed to get container status \"61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb\": rpc error: code = NotFound desc = could not find container \"61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb\": container with ID starting with 61e8573281b06a1e794a5817cfcad8b9a6e9023375af22f2213479a79a79f5bb not found: ID does not exist" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.253462 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c19d7570-8aa3-4699-8824-0f800d93504c" (UID: "c19d7570-8aa3-4699-8824-0f800d93504c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.341831 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pm6k2"] Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.348388 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19d7570-8aa3-4699-8824-0f800d93504c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:29:22 crc kubenswrapper[4923]: I0224 03:29:22.352507 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pm6k2"] Feb 24 03:29:23 crc kubenswrapper[4923]: I0224 03:29:23.725573 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c19d7570-8aa3-4699-8824-0f800d93504c" path="/var/lib/kubelet/pods/c19d7570-8aa3-4699-8824-0f800d93504c/volumes" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.150847 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6"] Feb 24 03:30:00 crc 
kubenswrapper[4923]: E0224 03:30:00.151845 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19d7570-8aa3-4699-8824-0f800d93504c" containerName="extract-content" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.151861 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19d7570-8aa3-4699-8824-0f800d93504c" containerName="extract-content" Feb 24 03:30:00 crc kubenswrapper[4923]: E0224 03:30:00.151881 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19d7570-8aa3-4699-8824-0f800d93504c" containerName="extract-utilities" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.151890 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19d7570-8aa3-4699-8824-0f800d93504c" containerName="extract-utilities" Feb 24 03:30:00 crc kubenswrapper[4923]: E0224 03:30:00.151915 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19d7570-8aa3-4699-8824-0f800d93504c" containerName="registry-server" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.151922 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19d7570-8aa3-4699-8824-0f800d93504c" containerName="registry-server" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.152102 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19d7570-8aa3-4699-8824-0f800d93504c" containerName="registry-server" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.152795 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.157750 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.158000 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.164101 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6"] Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.218724 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/15f82de1-80a2-460d-af7a-b420486b0547-secret-volume\") pod \"collect-profiles-29531730-cljs6\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.218842 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15f82de1-80a2-460d-af7a-b420486b0547-config-volume\") pod \"collect-profiles-29531730-cljs6\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.218903 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj5kt\" (UniqueName: \"kubernetes.io/projected/15f82de1-80a2-460d-af7a-b420486b0547-kube-api-access-vj5kt\") pod \"collect-profiles-29531730-cljs6\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.321133 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/15f82de1-80a2-460d-af7a-b420486b0547-secret-volume\") pod \"collect-profiles-29531730-cljs6\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.321275 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15f82de1-80a2-460d-af7a-b420486b0547-config-volume\") pod \"collect-profiles-29531730-cljs6\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.321438 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj5kt\" (UniqueName: \"kubernetes.io/projected/15f82de1-80a2-460d-af7a-b420486b0547-kube-api-access-vj5kt\") pod \"collect-profiles-29531730-cljs6\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.322328 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15f82de1-80a2-460d-af7a-b420486b0547-config-volume\") pod \"collect-profiles-29531730-cljs6\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.327050 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/15f82de1-80a2-460d-af7a-b420486b0547-secret-volume\") pod \"collect-profiles-29531730-cljs6\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.336978 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj5kt\" (UniqueName: \"kubernetes.io/projected/15f82de1-80a2-460d-af7a-b420486b0547-kube-api-access-vj5kt\") pod \"collect-profiles-29531730-cljs6\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.478962 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:00 crc kubenswrapper[4923]: I0224 03:30:00.924958 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6"] Feb 24 03:30:01 crc kubenswrapper[4923]: I0224 03:30:01.400270 4923 generic.go:334] "Generic (PLEG): container finished" podID="15f82de1-80a2-460d-af7a-b420486b0547" containerID="cc1b2f6ccf645bae53e98017c6e6586f2223ddf2ad9a978b4df65dedb30b78c3" exitCode=0 Feb 24 03:30:01 crc kubenswrapper[4923]: I0224 03:30:01.400338 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" event={"ID":"15f82de1-80a2-460d-af7a-b420486b0547","Type":"ContainerDied","Data":"cc1b2f6ccf645bae53e98017c6e6586f2223ddf2ad9a978b4df65dedb30b78c3"} Feb 24 03:30:01 crc kubenswrapper[4923]: I0224 03:30:01.400392 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" 
event={"ID":"15f82de1-80a2-460d-af7a-b420486b0547","Type":"ContainerStarted","Data":"8265cfb38be3fc9f99d8eadeb396e11eb5a687e92070b41f83e7f956e32d371f"} Feb 24 03:30:02 crc kubenswrapper[4923]: I0224 03:30:02.750913 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:02 crc kubenswrapper[4923]: I0224 03:30:02.926105 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj5kt\" (UniqueName: \"kubernetes.io/projected/15f82de1-80a2-460d-af7a-b420486b0547-kube-api-access-vj5kt\") pod \"15f82de1-80a2-460d-af7a-b420486b0547\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " Feb 24 03:30:02 crc kubenswrapper[4923]: I0224 03:30:02.926161 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/15f82de1-80a2-460d-af7a-b420486b0547-secret-volume\") pod \"15f82de1-80a2-460d-af7a-b420486b0547\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " Feb 24 03:30:02 crc kubenswrapper[4923]: I0224 03:30:02.926206 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15f82de1-80a2-460d-af7a-b420486b0547-config-volume\") pod \"15f82de1-80a2-460d-af7a-b420486b0547\" (UID: \"15f82de1-80a2-460d-af7a-b420486b0547\") " Feb 24 03:30:02 crc kubenswrapper[4923]: I0224 03:30:02.927314 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15f82de1-80a2-460d-af7a-b420486b0547-config-volume" (OuterVolumeSpecName: "config-volume") pod "15f82de1-80a2-460d-af7a-b420486b0547" (UID: "15f82de1-80a2-460d-af7a-b420486b0547"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:30:02 crc kubenswrapper[4923]: I0224 03:30:02.932779 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f82de1-80a2-460d-af7a-b420486b0547-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "15f82de1-80a2-460d-af7a-b420486b0547" (UID: "15f82de1-80a2-460d-af7a-b420486b0547"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:30:02 crc kubenswrapper[4923]: I0224 03:30:02.933119 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f82de1-80a2-460d-af7a-b420486b0547-kube-api-access-vj5kt" (OuterVolumeSpecName: "kube-api-access-vj5kt") pod "15f82de1-80a2-460d-af7a-b420486b0547" (UID: "15f82de1-80a2-460d-af7a-b420486b0547"). InnerVolumeSpecName "kube-api-access-vj5kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:30:03 crc kubenswrapper[4923]: I0224 03:30:03.028360 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj5kt\" (UniqueName: \"kubernetes.io/projected/15f82de1-80a2-460d-af7a-b420486b0547-kube-api-access-vj5kt\") on node \"crc\" DevicePath \"\"" Feb 24 03:30:03 crc kubenswrapper[4923]: I0224 03:30:03.028393 4923 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/15f82de1-80a2-460d-af7a-b420486b0547-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 03:30:03 crc kubenswrapper[4923]: I0224 03:30:03.028407 4923 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15f82de1-80a2-460d-af7a-b420486b0547-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 03:30:03 crc kubenswrapper[4923]: I0224 03:30:03.419029 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" 
event={"ID":"15f82de1-80a2-460d-af7a-b420486b0547","Type":"ContainerDied","Data":"8265cfb38be3fc9f99d8eadeb396e11eb5a687e92070b41f83e7f956e32d371f"} Feb 24 03:30:03 crc kubenswrapper[4923]: I0224 03:30:03.419072 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8265cfb38be3fc9f99d8eadeb396e11eb5a687e92070b41f83e7f956e32d371f" Feb 24 03:30:03 crc kubenswrapper[4923]: I0224 03:30:03.419356 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531730-cljs6" Feb 24 03:30:03 crc kubenswrapper[4923]: I0224 03:30:03.820442 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc"] Feb 24 03:30:03 crc kubenswrapper[4923]: I0224 03:30:03.827514 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531685-dg9vc"] Feb 24 03:30:05 crc kubenswrapper[4923]: I0224 03:30:05.730695 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a99e06cc-b200-4073-a847-410f9799eb3a" path="/var/lib/kubelet/pods/a99e06cc-b200-4073-a847-410f9799eb3a/volumes" Feb 24 03:30:44 crc kubenswrapper[4923]: I0224 03:30:44.145195 4923 scope.go:117] "RemoveContainer" containerID="a76c0488461ed909bb928cc0aadfa91454b9c0426b9fb3c1fef88d255d4142a4" Feb 24 03:31:49 crc kubenswrapper[4923]: I0224 03:31:49.916392 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:31:49 crc kubenswrapper[4923]: I0224 03:31:49.917024 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:32:19 crc kubenswrapper[4923]: I0224 03:32:19.916978 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:32:19 crc kubenswrapper[4923]: I0224 03:32:19.917570 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:32:49 crc kubenswrapper[4923]: I0224 03:32:49.916582 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:32:49 crc kubenswrapper[4923]: I0224 03:32:49.917461 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:32:49 crc kubenswrapper[4923]: I0224 03:32:49.917525 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 03:32:49 crc kubenswrapper[4923]: I0224 03:32:49.918566 4923 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"960084f70366f4433330b126ef123c279f963df8e18f6dfbc42d695fdb9ed015"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 03:32:49 crc kubenswrapper[4923]: I0224 03:32:49.918664 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://960084f70366f4433330b126ef123c279f963df8e18f6dfbc42d695fdb9ed015" gracePeriod=600 Feb 24 03:32:50 crc kubenswrapper[4923]: I0224 03:32:50.101274 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="960084f70366f4433330b126ef123c279f963df8e18f6dfbc42d695fdb9ed015" exitCode=0 Feb 24 03:32:50 crc kubenswrapper[4923]: I0224 03:32:50.101371 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"960084f70366f4433330b126ef123c279f963df8e18f6dfbc42d695fdb9ed015"} Feb 24 03:32:50 crc kubenswrapper[4923]: I0224 03:32:50.101476 4923 scope.go:117] "RemoveContainer" containerID="e17f8e293cf35c7bb2e75dcc7b7c94eeba304b426270a51e34948b765974fda6" Feb 24 03:32:51 crc kubenswrapper[4923]: I0224 03:32:51.117056 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656"} Feb 24 03:33:20 crc kubenswrapper[4923]: I0224 03:33:20.435661 4923 generic.go:334] "Generic (PLEG): container finished" podID="aca2992d-fbda-4dad-8ab4-02147a40ed9e" 
containerID="6d372d65aa58c8e32b56edc53bf5192282038b75289981e8a5e6f1e900d6252d" exitCode=0 Feb 24 03:33:20 crc kubenswrapper[4923]: I0224 03:33:20.435834 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" event={"ID":"aca2992d-fbda-4dad-8ab4-02147a40ed9e","Type":"ContainerDied","Data":"6d372d65aa58c8e32b56edc53bf5192282038b75289981e8a5e6f1e900d6252d"} Feb 24 03:33:21 crc kubenswrapper[4923]: I0224 03:33:21.920724 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.006446 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-combined-ca-bundle\") pod \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.006663 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rgt5\" (UniqueName: \"kubernetes.io/projected/aca2992d-fbda-4dad-8ab4-02147a40ed9e-kube-api-access-6rgt5\") pod \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.006920 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-inventory\") pod \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.006995 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-secret-0\") pod 
\"aca2992d-fbda-4dad-8ab4-02147a40ed9e\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.007037 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-ssh-key-openstack-edpm-ipam\") pod \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\" (UID: \"aca2992d-fbda-4dad-8ab4-02147a40ed9e\") " Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.016092 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca2992d-fbda-4dad-8ab4-02147a40ed9e-kube-api-access-6rgt5" (OuterVolumeSpecName: "kube-api-access-6rgt5") pod "aca2992d-fbda-4dad-8ab4-02147a40ed9e" (UID: "aca2992d-fbda-4dad-8ab4-02147a40ed9e"). InnerVolumeSpecName "kube-api-access-6rgt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.020198 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "aca2992d-fbda-4dad-8ab4-02147a40ed9e" (UID: "aca2992d-fbda-4dad-8ab4-02147a40ed9e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.047698 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-inventory" (OuterVolumeSpecName: "inventory") pod "aca2992d-fbda-4dad-8ab4-02147a40ed9e" (UID: "aca2992d-fbda-4dad-8ab4-02147a40ed9e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.048371 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "aca2992d-fbda-4dad-8ab4-02147a40ed9e" (UID: "aca2992d-fbda-4dad-8ab4-02147a40ed9e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.067138 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "aca2992d-fbda-4dad-8ab4-02147a40ed9e" (UID: "aca2992d-fbda-4dad-8ab4-02147a40ed9e"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.110100 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rgt5\" (UniqueName: \"kubernetes.io/projected/aca2992d-fbda-4dad-8ab4-02147a40ed9e-kube-api-access-6rgt5\") on node \"crc\" DevicePath \"\"" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.110151 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.110168 4923 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.110184 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.110200 4923 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca2992d-fbda-4dad-8ab4-02147a40ed9e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.458020 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" event={"ID":"aca2992d-fbda-4dad-8ab4-02147a40ed9e","Type":"ContainerDied","Data":"cb489f6b82652b3c9e335c014e0950122e6068887374607cdd66e2526087e968"} Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.458354 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb489f6b82652b3c9e335c014e0950122e6068887374607cdd66e2526087e968" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.458604 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.575020 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h"] Feb 24 03:33:22 crc kubenswrapper[4923]: E0224 03:33:22.577107 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f82de1-80a2-460d-af7a-b420486b0547" containerName="collect-profiles" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.577122 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f82de1-80a2-460d-af7a-b420486b0547" containerName="collect-profiles" Feb 24 03:33:22 crc kubenswrapper[4923]: E0224 03:33:22.577145 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca2992d-fbda-4dad-8ab4-02147a40ed9e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.577152 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca2992d-fbda-4dad-8ab4-02147a40ed9e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.577345 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f82de1-80a2-460d-af7a-b420486b0547" containerName="collect-profiles" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.577356 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca2992d-fbda-4dad-8ab4-02147a40ed9e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.578105 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.581062 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.581405 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.581650 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.582233 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.582467 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.582678 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.583014 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.588886 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h"] Feb 24 03:33:22 crc kubenswrapper[4923]: E0224 03:33:22.647653 4923 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaca2992d_fbda_4dad_8ab4_02147a40ed9e.slice/crio-cb489f6b82652b3c9e335c014e0950122e6068887374607cdd66e2526087e968\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaca2992d_fbda_4dad_8ab4_02147a40ed9e.slice\": RecentStats: unable to find data in memory cache]" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.719972 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn2bs\" (UniqueName: \"kubernetes.io/projected/08f21e51-2e83-4b47-b794-1e3a2358381d-kube-api-access-pn2bs\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720020 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720051 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720085 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720116 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720171 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720203 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720220 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720244 4923 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720260 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.720278 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.824979 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn2bs\" (UniqueName: \"kubernetes.io/projected/08f21e51-2e83-4b47-b794-1e3a2358381d-kube-api-access-pn2bs\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825050 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825088 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825128 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825185 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825254 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825326 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825353 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825395 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825421 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.825486 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.826984 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.831675 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.831799 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.832034 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc 
kubenswrapper[4923]: I0224 03:33:22.832764 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.833018 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.833516 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.834538 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.834710 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.835060 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.840589 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn2bs\" (UniqueName: \"kubernetes.io/projected/08f21e51-2e83-4b47-b794-1e3a2358381d-kube-api-access-pn2bs\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rfz2h\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:22 crc kubenswrapper[4923]: I0224 03:33:22.910568 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:33:23 crc kubenswrapper[4923]: I0224 03:33:23.426847 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h"] Feb 24 03:33:23 crc kubenswrapper[4923]: I0224 03:33:23.436395 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 03:33:23 crc kubenswrapper[4923]: I0224 03:33:23.469162 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" event={"ID":"08f21e51-2e83-4b47-b794-1e3a2358381d","Type":"ContainerStarted","Data":"a9806dde6b8f67dd58c34ea48cb6af89defaa06c7865f8a6af31ffeba0026e0e"} Feb 24 03:33:24 crc kubenswrapper[4923]: I0224 03:33:24.478214 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" event={"ID":"08f21e51-2e83-4b47-b794-1e3a2358381d","Type":"ContainerStarted","Data":"c6a465d260f5d89c81de7d6695f34e0d0bef710e26c56ac2cf30f3acc3dc2cb6"} Feb 24 03:33:24 crc kubenswrapper[4923]: I0224 03:33:24.502167 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" podStartSLOduration=1.88204386 podStartE2EDuration="2.502149224s" podCreationTimestamp="2026-02-24 03:33:22 +0000 UTC" firstStartedPulling="2026-02-24 03:33:23.435891186 +0000 UTC m=+2327.452962029" lastFinishedPulling="2026-02-24 03:33:24.05599657 +0000 UTC m=+2328.073067393" observedRunningTime="2026-02-24 03:33:24.494138364 +0000 UTC m=+2328.511209187" watchObservedRunningTime="2026-02-24 03:33:24.502149224 +0000 UTC m=+2328.519220037" Feb 24 03:35:19 crc kubenswrapper[4923]: I0224 03:35:19.916573 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:35:19 crc kubenswrapper[4923]: I0224 03:35:19.917356 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:35:49 crc kubenswrapper[4923]: I0224 03:35:49.916850 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:35:49 crc kubenswrapper[4923]: I0224 03:35:49.917438 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:35:56 crc kubenswrapper[4923]: I0224 03:35:56.093367 4923 generic.go:334] "Generic (PLEG): container finished" podID="08f21e51-2e83-4b47-b794-1e3a2358381d" containerID="c6a465d260f5d89c81de7d6695f34e0d0bef710e26c56ac2cf30f3acc3dc2cb6" exitCode=0 Feb 24 03:35:56 crc kubenswrapper[4923]: I0224 03:35:56.093463 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" event={"ID":"08f21e51-2e83-4b47-b794-1e3a2358381d","Type":"ContainerDied","Data":"c6a465d260f5d89c81de7d6695f34e0d0bef710e26c56ac2cf30f3acc3dc2cb6"} Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.550380 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.628345 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-extra-config-0\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.628864 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-combined-ca-bundle\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.628965 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-inventory\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.629044 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-0\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.629099 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-0\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.629154 4923 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-2\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.629215 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn2bs\" (UniqueName: \"kubernetes.io/projected/08f21e51-2e83-4b47-b794-1e3a2358381d-kube-api-access-pn2bs\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.629285 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-1\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.630681 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-1\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.630793 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-ssh-key-openstack-edpm-ipam\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.630852 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-3\") pod \"08f21e51-2e83-4b47-b794-1e3a2358381d\" (UID: \"08f21e51-2e83-4b47-b794-1e3a2358381d\") " Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.633613 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.656530 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f21e51-2e83-4b47-b794-1e3a2358381d-kube-api-access-pn2bs" (OuterVolumeSpecName: "kube-api-access-pn2bs") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "kube-api-access-pn2bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.658399 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.660191 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.660500 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.662226 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-inventory" (OuterVolumeSpecName: "inventory") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.673276 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.673692 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.680175 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.686772 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.692998 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "08f21e51-2e83-4b47-b794-1e3a2358381d" (UID: "08f21e51-2e83-4b47-b794-1e3a2358381d"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.733678 4923 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.733781 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.733848 4923 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.733908 4923 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.733967 4923 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.734059 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn2bs\" (UniqueName: \"kubernetes.io/projected/08f21e51-2e83-4b47-b794-1e3a2358381d-kube-api-access-pn2bs\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.734127 4923 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-migration-ssh-key-1\") on node 
\"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.734345 4923 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.734452 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.734533 4923 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:57 crc kubenswrapper[4923]: I0224 03:35:57.734790 4923 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08f21e51-2e83-4b47-b794-1e3a2358381d-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.117625 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" event={"ID":"08f21e51-2e83-4b47-b794-1e3a2358381d","Type":"ContainerDied","Data":"a9806dde6b8f67dd58c34ea48cb6af89defaa06c7865f8a6af31ffeba0026e0e"} Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.117887 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9806dde6b8f67dd58c34ea48cb6af89defaa06c7865f8a6af31ffeba0026e0e" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.117742 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rfz2h" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.230469 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8"] Feb 24 03:35:58 crc kubenswrapper[4923]: E0224 03:35:58.230929 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f21e51-2e83-4b47-b794-1e3a2358381d" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.230949 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f21e51-2e83-4b47-b794-1e3a2358381d" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.231166 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f21e51-2e83-4b47-b794-1e3a2358381d" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.231895 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.237311 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.238451 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.239021 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.239473 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.241392 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fgpt8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.249199 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8"] Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.351149 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.352050 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.352216 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.352361 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.352483 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.352570 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-882dv\" (UniqueName: \"kubernetes.io/projected/d8be4d6b-1c52-43ae-addf-ad44faf403f2-kube-api-access-882dv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: 
\"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.352649 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.454153 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.454229 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.454285 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 
03:35:58.454326 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-882dv\" (UniqueName: \"kubernetes.io/projected/d8be4d6b-1c52-43ae-addf-ad44faf403f2-kube-api-access-882dv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.454357 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.454420 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.454451 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.460883 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-inventory\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.461149 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.463018 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.463072 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.463335 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 
24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.463526 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.480836 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-882dv\" (UniqueName: \"kubernetes.io/projected/d8be4d6b-1c52-43ae-addf-ad44faf403f2-kube-api-access-882dv\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:58 crc kubenswrapper[4923]: I0224 03:35:58.591387 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:35:59 crc kubenswrapper[4923]: I0224 03:35:59.106710 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8"] Feb 24 03:35:59 crc kubenswrapper[4923]: W0224 03:35:59.112052 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8be4d6b_1c52_43ae_addf_ad44faf403f2.slice/crio-6b7b6aae9e5eae2fad2c331b64699fb67736aaf19d8165981a630c96e4e3b1e1 WatchSource:0}: Error finding container 6b7b6aae9e5eae2fad2c331b64699fb67736aaf19d8165981a630c96e4e3b1e1: Status 404 returned error can't find the container with id 6b7b6aae9e5eae2fad2c331b64699fb67736aaf19d8165981a630c96e4e3b1e1 Feb 24 03:35:59 crc kubenswrapper[4923]: I0224 03:35:59.127603 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" event={"ID":"d8be4d6b-1c52-43ae-addf-ad44faf403f2","Type":"ContainerStarted","Data":"6b7b6aae9e5eae2fad2c331b64699fb67736aaf19d8165981a630c96e4e3b1e1"} Feb 24 03:36:00 crc kubenswrapper[4923]: I0224 03:36:00.146618 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" event={"ID":"d8be4d6b-1c52-43ae-addf-ad44faf403f2","Type":"ContainerStarted","Data":"ae17b4af6dc5e6ae6d8636b19f1cf486e5493be7b8f37b48e67114390eeb2b16"} Feb 24 03:36:00 crc kubenswrapper[4923]: I0224 03:36:00.178732 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" podStartSLOduration=1.739800426 podStartE2EDuration="2.178711343s" podCreationTimestamp="2026-02-24 03:35:58 +0000 UTC" firstStartedPulling="2026-02-24 03:35:59.114975258 +0000 UTC m=+2483.132046071" lastFinishedPulling="2026-02-24 03:35:59.553886145 +0000 UTC m=+2483.570956988" 
observedRunningTime="2026-02-24 03:36:00.171909436 +0000 UTC m=+2484.188980309" watchObservedRunningTime="2026-02-24 03:36:00.178711343 +0000 UTC m=+2484.195782176" Feb 24 03:36:19 crc kubenswrapper[4923]: I0224 03:36:19.916388 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:36:19 crc kubenswrapper[4923]: I0224 03:36:19.916841 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:36:19 crc kubenswrapper[4923]: I0224 03:36:19.916881 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 03:36:19 crc kubenswrapper[4923]: I0224 03:36:19.917425 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 03:36:19 crc kubenswrapper[4923]: I0224 03:36:19.917480 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" gracePeriod=600 Feb 24 03:36:20 crc kubenswrapper[4923]: E0224 
03:36:20.045588 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:36:20 crc kubenswrapper[4923]: I0224 03:36:20.353984 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" exitCode=0 Feb 24 03:36:20 crc kubenswrapper[4923]: I0224 03:36:20.354073 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656"} Feb 24 03:36:20 crc kubenswrapper[4923]: I0224 03:36:20.354189 4923 scope.go:117] "RemoveContainer" containerID="960084f70366f4433330b126ef123c279f963df8e18f6dfbc42d695fdb9ed015" Feb 24 03:36:20 crc kubenswrapper[4923]: I0224 03:36:20.355379 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:36:20 crc kubenswrapper[4923]: E0224 03:36:20.355897 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:36:33 crc kubenswrapper[4923]: I0224 03:36:33.713155 4923 scope.go:117] "RemoveContainer" 
containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:36:33 crc kubenswrapper[4923]: E0224 03:36:33.714000 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:36:44 crc kubenswrapper[4923]: I0224 03:36:44.713053 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:36:44 crc kubenswrapper[4923]: E0224 03:36:44.713894 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:36:58 crc kubenswrapper[4923]: I0224 03:36:58.713663 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:36:58 crc kubenswrapper[4923]: E0224 03:36:58.714381 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:37:09 crc kubenswrapper[4923]: I0224 03:37:09.713563 4923 scope.go:117] 
"RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:37:09 crc kubenswrapper[4923]: E0224 03:37:09.714511 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:37:23 crc kubenswrapper[4923]: I0224 03:37:23.712880 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:37:23 crc kubenswrapper[4923]: E0224 03:37:23.713692 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:37:26 crc kubenswrapper[4923]: I0224 03:37:26.196446 4923 patch_prober.go:28] interesting pod/router-default-5444994796-ctnr7 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 03:37:26 crc kubenswrapper[4923]: I0224 03:37:26.196779 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-ctnr7" podUID="10a4e000-d2a9-455f-a7a7-ae4d90611c29" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Feb 24 03:37:35 crc kubenswrapper[4923]: I0224 03:37:35.713950 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:37:35 crc kubenswrapper[4923]: E0224 03:37:35.715156 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:37:48 crc kubenswrapper[4923]: I0224 03:37:48.713605 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:37:48 crc kubenswrapper[4923]: E0224 03:37:48.714528 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:38:03 crc kubenswrapper[4923]: I0224 03:38:03.714602 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:38:03 crc kubenswrapper[4923]: E0224 03:38:03.716158 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" 
podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:38:17 crc kubenswrapper[4923]: I0224 03:38:17.721867 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:38:17 crc kubenswrapper[4923]: E0224 03:38:17.722783 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:38:32 crc kubenswrapper[4923]: I0224 03:38:32.713013 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:38:32 crc kubenswrapper[4923]: E0224 03:38:32.713618 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:38:34 crc kubenswrapper[4923]: I0224 03:38:34.039777 4923 generic.go:334] "Generic (PLEG): container finished" podID="d8be4d6b-1c52-43ae-addf-ad44faf403f2" containerID="ae17b4af6dc5e6ae6d8636b19f1cf486e5493be7b8f37b48e67114390eeb2b16" exitCode=0 Feb 24 03:38:34 crc kubenswrapper[4923]: I0224 03:38:34.039885 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" event={"ID":"d8be4d6b-1c52-43ae-addf-ad44faf403f2","Type":"ContainerDied","Data":"ae17b4af6dc5e6ae6d8636b19f1cf486e5493be7b8f37b48e67114390eeb2b16"} Feb 24 03:38:35 
crc kubenswrapper[4923]: I0224 03:38:35.508853 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.570534 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-inventory\") pod \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.570591 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-0\") pod \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.595638 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d8be4d6b-1c52-43ae-addf-ad44faf403f2" (UID: "d8be4d6b-1c52-43ae-addf-ad44faf403f2"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.598982 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-inventory" (OuterVolumeSpecName: "inventory") pod "d8be4d6b-1c52-43ae-addf-ad44faf403f2" (UID: "d8be4d6b-1c52-43ae-addf-ad44faf403f2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.671811 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-telemetry-combined-ca-bundle\") pod \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.672345 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-1\") pod \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.672510 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-2\") pod \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.672628 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-882dv\" (UniqueName: \"kubernetes.io/projected/d8be4d6b-1c52-43ae-addf-ad44faf403f2-kube-api-access-882dv\") pod \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.672715 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ssh-key-openstack-edpm-ipam\") pod \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\" (UID: \"d8be4d6b-1c52-43ae-addf-ad44faf403f2\") " Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 
03:38:35.673368 4923 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-inventory\") on node \"crc\" DevicePath \"\"" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.673476 4923 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.677232 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8be4d6b-1c52-43ae-addf-ad44faf403f2-kube-api-access-882dv" (OuterVolumeSpecName: "kube-api-access-882dv") pod "d8be4d6b-1c52-43ae-addf-ad44faf403f2" (UID: "d8be4d6b-1c52-43ae-addf-ad44faf403f2"). InnerVolumeSpecName "kube-api-access-882dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.677933 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d8be4d6b-1c52-43ae-addf-ad44faf403f2" (UID: "d8be4d6b-1c52-43ae-addf-ad44faf403f2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.702347 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d8be4d6b-1c52-43ae-addf-ad44faf403f2" (UID: "d8be4d6b-1c52-43ae-addf-ad44faf403f2"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.703317 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d8be4d6b-1c52-43ae-addf-ad44faf403f2" (UID: "d8be4d6b-1c52-43ae-addf-ad44faf403f2"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.710011 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d8be4d6b-1c52-43ae-addf-ad44faf403f2" (UID: "d8be4d6b-1c52-43ae-addf-ad44faf403f2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.774958 4923 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.775064 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-882dv\" (UniqueName: \"kubernetes.io/projected/d8be4d6b-1c52-43ae-addf-ad44faf403f2-kube-api-access-882dv\") on node \"crc\" DevicePath \"\"" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.775086 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.775140 4923 reconciler_common.go:293] "Volume 
detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 03:38:35 crc kubenswrapper[4923]: I0224 03:38:35.775161 4923 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d8be4d6b-1c52-43ae-addf-ad44faf403f2-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 24 03:38:36 crc kubenswrapper[4923]: I0224 03:38:36.061535 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" event={"ID":"d8be4d6b-1c52-43ae-addf-ad44faf403f2","Type":"ContainerDied","Data":"6b7b6aae9e5eae2fad2c331b64699fb67736aaf19d8165981a630c96e4e3b1e1"} Feb 24 03:38:36 crc kubenswrapper[4923]: I0224 03:38:36.061582 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b7b6aae9e5eae2fad2c331b64699fb67736aaf19d8165981a630c96e4e3b1e1" Feb 24 03:38:36 crc kubenswrapper[4923]: I0224 03:38:36.061693 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.017810 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-95ln7"] Feb 24 03:38:43 crc kubenswrapper[4923]: E0224 03:38:43.018809 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8be4d6b-1c52-43ae-addf-ad44faf403f2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.018824 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8be4d6b-1c52-43ae-addf-ad44faf403f2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.019028 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8be4d6b-1c52-43ae-addf-ad44faf403f2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.020518 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.027529 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-utilities\") pod \"redhat-operators-95ln7\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.027647 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-catalog-content\") pod \"redhat-operators-95ln7\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.027755 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvsb8\" (UniqueName: \"kubernetes.io/projected/4833fd35-eb50-430c-b7bc-2e179d1bf41c-kube-api-access-rvsb8\") pod \"redhat-operators-95ln7\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.059717 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95ln7"] Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.128870 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-catalog-content\") pod \"redhat-operators-95ln7\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.128930 4923 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rvsb8\" (UniqueName: \"kubernetes.io/projected/4833fd35-eb50-430c-b7bc-2e179d1bf41c-kube-api-access-rvsb8\") pod \"redhat-operators-95ln7\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.129763 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-catalog-content\") pod \"redhat-operators-95ln7\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.130602 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-utilities\") pod \"redhat-operators-95ln7\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.130850 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-utilities\") pod \"redhat-operators-95ln7\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.148716 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvsb8\" (UniqueName: \"kubernetes.io/projected/4833fd35-eb50-430c-b7bc-2e179d1bf41c-kube-api-access-rvsb8\") pod \"redhat-operators-95ln7\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.352158 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:43 crc kubenswrapper[4923]: I0224 03:38:43.798663 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95ln7"] Feb 24 03:38:44 crc kubenswrapper[4923]: I0224 03:38:44.154482 4923 generic.go:334] "Generic (PLEG): container finished" podID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerID="fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a" exitCode=0 Feb 24 03:38:44 crc kubenswrapper[4923]: I0224 03:38:44.154565 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95ln7" event={"ID":"4833fd35-eb50-430c-b7bc-2e179d1bf41c","Type":"ContainerDied","Data":"fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a"} Feb 24 03:38:44 crc kubenswrapper[4923]: I0224 03:38:44.154598 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95ln7" event={"ID":"4833fd35-eb50-430c-b7bc-2e179d1bf41c","Type":"ContainerStarted","Data":"36dc6f007ba3bf157745e289e9374817ca22bc692f01f99ee12f1e7c43041a56"} Feb 24 03:38:44 crc kubenswrapper[4923]: I0224 03:38:44.156026 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 03:38:45 crc kubenswrapper[4923]: I0224 03:38:45.169939 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95ln7" event={"ID":"4833fd35-eb50-430c-b7bc-2e179d1bf41c","Type":"ContainerStarted","Data":"1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e"} Feb 24 03:38:46 crc kubenswrapper[4923]: I0224 03:38:46.714121 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:38:46 crc kubenswrapper[4923]: E0224 03:38:46.715044 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:38:48 crc kubenswrapper[4923]: I0224 03:38:48.193174 4923 generic.go:334] "Generic (PLEG): container finished" podID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerID="1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e" exitCode=0 Feb 24 03:38:48 crc kubenswrapper[4923]: I0224 03:38:48.193217 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95ln7" event={"ID":"4833fd35-eb50-430c-b7bc-2e179d1bf41c","Type":"ContainerDied","Data":"1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e"} Feb 24 03:38:49 crc kubenswrapper[4923]: I0224 03:38:49.204193 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95ln7" event={"ID":"4833fd35-eb50-430c-b7bc-2e179d1bf41c","Type":"ContainerStarted","Data":"8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5"} Feb 24 03:38:49 crc kubenswrapper[4923]: I0224 03:38:49.233056 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-95ln7" podStartSLOduration=2.81238224 podStartE2EDuration="7.233028235s" podCreationTimestamp="2026-02-24 03:38:42 +0000 UTC" firstStartedPulling="2026-02-24 03:38:44.155791893 +0000 UTC m=+2648.172862706" lastFinishedPulling="2026-02-24 03:38:48.576437898 +0000 UTC m=+2652.593508701" observedRunningTime="2026-02-24 03:38:49.224960274 +0000 UTC m=+2653.242031087" watchObservedRunningTime="2026-02-24 03:38:49.233028235 +0000 UTC m=+2653.250099078" Feb 24 03:38:53 crc kubenswrapper[4923]: I0224 03:38:53.353158 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:53 crc kubenswrapper[4923]: I0224 03:38:53.353847 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:38:54 crc kubenswrapper[4923]: I0224 03:38:54.438113 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-95ln7" podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerName="registry-server" probeResult="failure" output=< Feb 24 03:38:54 crc kubenswrapper[4923]: timeout: failed to connect service ":50051" within 1s Feb 24 03:38:54 crc kubenswrapper[4923]: > Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.313606 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wsqgq"] Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.318136 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.339019 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsqgq"] Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.401187 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-catalog-content\") pod \"redhat-marketplace-wsqgq\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.401416 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-utilities\") pod \"redhat-marketplace-wsqgq\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " 
pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.401619 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzm8j\" (UniqueName: \"kubernetes.io/projected/e2dd9af9-2a22-4a54-b370-05203a3e70d0-kube-api-access-rzm8j\") pod \"redhat-marketplace-wsqgq\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.504473 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-catalog-content\") pod \"redhat-marketplace-wsqgq\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.504631 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-utilities\") pod \"redhat-marketplace-wsqgq\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.504715 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzm8j\" (UniqueName: \"kubernetes.io/projected/e2dd9af9-2a22-4a54-b370-05203a3e70d0-kube-api-access-rzm8j\") pod \"redhat-marketplace-wsqgq\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.504958 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-catalog-content\") pod \"redhat-marketplace-wsqgq\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " 
pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.505368 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-utilities\") pod \"redhat-marketplace-wsqgq\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.525964 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzm8j\" (UniqueName: \"kubernetes.io/projected/e2dd9af9-2a22-4a54-b370-05203a3e70d0-kube-api-access-rzm8j\") pod \"redhat-marketplace-wsqgq\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:55 crc kubenswrapper[4923]: I0224 03:38:55.647182 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:38:56 crc kubenswrapper[4923]: I0224 03:38:56.166672 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsqgq"] Feb 24 03:38:56 crc kubenswrapper[4923]: I0224 03:38:56.281525 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsqgq" event={"ID":"e2dd9af9-2a22-4a54-b370-05203a3e70d0","Type":"ContainerStarted","Data":"64d0de9e1d77c22b9b2437fd006e0647e15eef4e9babe07ea2a02d233c8b4232"} Feb 24 03:38:57 crc kubenswrapper[4923]: I0224 03:38:57.290379 4923 generic.go:334] "Generic (PLEG): container finished" podID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerID="0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75" exitCode=0 Feb 24 03:38:57 crc kubenswrapper[4923]: I0224 03:38:57.290478 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsqgq" 
event={"ID":"e2dd9af9-2a22-4a54-b370-05203a3e70d0","Type":"ContainerDied","Data":"0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75"} Feb 24 03:38:58 crc kubenswrapper[4923]: I0224 03:38:58.305507 4923 generic.go:334] "Generic (PLEG): container finished" podID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerID="29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378" exitCode=0 Feb 24 03:38:58 crc kubenswrapper[4923]: I0224 03:38:58.305639 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsqgq" event={"ID":"e2dd9af9-2a22-4a54-b370-05203a3e70d0","Type":"ContainerDied","Data":"29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378"} Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.317549 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsqgq" event={"ID":"e2dd9af9-2a22-4a54-b370-05203a3e70d0","Type":"ContainerStarted","Data":"091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3"} Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.340563 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wsqgq" podStartSLOduration=2.92684155 podStartE2EDuration="4.340540581s" podCreationTimestamp="2026-02-24 03:38:55 +0000 UTC" firstStartedPulling="2026-02-24 03:38:57.292254039 +0000 UTC m=+2661.309324852" lastFinishedPulling="2026-02-24 03:38:58.70595307 +0000 UTC m=+2662.723023883" observedRunningTime="2026-02-24 03:38:59.333160428 +0000 UTC m=+2663.350231261" watchObservedRunningTime="2026-02-24 03:38:59.340540581 +0000 UTC m=+2663.357611404" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.681594 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4wrp9"] Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.683398 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.707164 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4wrp9"] Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.713246 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:38:59 crc kubenswrapper[4923]: E0224 03:38:59.713660 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.803775 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmzjr\" (UniqueName: \"kubernetes.io/projected/2419463d-6462-426c-b238-aaa487c884e2-kube-api-access-mmzjr\") pod \"community-operators-4wrp9\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.803860 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-catalog-content\") pod \"community-operators-4wrp9\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.804196 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-utilities\") pod \"community-operators-4wrp9\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.906127 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-catalog-content\") pod \"community-operators-4wrp9\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.906230 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-utilities\") pod \"community-operators-4wrp9\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.906336 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmzjr\" (UniqueName: \"kubernetes.io/projected/2419463d-6462-426c-b238-aaa487c884e2-kube-api-access-mmzjr\") pod \"community-operators-4wrp9\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.907375 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-utilities\") pod \"community-operators-4wrp9\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.907553 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-catalog-content\") pod \"community-operators-4wrp9\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:38:59 crc kubenswrapper[4923]: I0224 03:38:59.939866 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmzjr\" (UniqueName: \"kubernetes.io/projected/2419463d-6462-426c-b238-aaa487c884e2-kube-api-access-mmzjr\") pod \"community-operators-4wrp9\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:39:00 crc kubenswrapper[4923]: I0224 03:39:00.000440 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:39:00 crc kubenswrapper[4923]: I0224 03:39:00.561274 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4wrp9"] Feb 24 03:39:00 crc kubenswrapper[4923]: W0224 03:39:00.572466 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2419463d_6462_426c_b238_aaa487c884e2.slice/crio-67571331f34499da95a4290c3ef795ef74136599218dcf081e00bb193405ee87 WatchSource:0}: Error finding container 67571331f34499da95a4290c3ef795ef74136599218dcf081e00bb193405ee87: Status 404 returned error can't find the container with id 67571331f34499da95a4290c3ef795ef74136599218dcf081e00bb193405ee87 Feb 24 03:39:01 crc kubenswrapper[4923]: I0224 03:39:01.337857 4923 generic.go:334] "Generic (PLEG): container finished" podID="2419463d-6462-426c-b238-aaa487c884e2" containerID="69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7" exitCode=0 Feb 24 03:39:01 crc kubenswrapper[4923]: I0224 03:39:01.338089 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wrp9" 
event={"ID":"2419463d-6462-426c-b238-aaa487c884e2","Type":"ContainerDied","Data":"69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7"} Feb 24 03:39:01 crc kubenswrapper[4923]: I0224 03:39:01.338638 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wrp9" event={"ID":"2419463d-6462-426c-b238-aaa487c884e2","Type":"ContainerStarted","Data":"67571331f34499da95a4290c3ef795ef74136599218dcf081e00bb193405ee87"} Feb 24 03:39:02 crc kubenswrapper[4923]: I0224 03:39:02.357818 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wrp9" event={"ID":"2419463d-6462-426c-b238-aaa487c884e2","Type":"ContainerStarted","Data":"6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0"} Feb 24 03:39:03 crc kubenswrapper[4923]: I0224 03:39:03.373763 4923 generic.go:334] "Generic (PLEG): container finished" podID="2419463d-6462-426c-b238-aaa487c884e2" containerID="6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0" exitCode=0 Feb 24 03:39:03 crc kubenswrapper[4923]: I0224 03:39:03.373822 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wrp9" event={"ID":"2419463d-6462-426c-b238-aaa487c884e2","Type":"ContainerDied","Data":"6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0"} Feb 24 03:39:03 crc kubenswrapper[4923]: I0224 03:39:03.374125 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wrp9" event={"ID":"2419463d-6462-426c-b238-aaa487c884e2","Type":"ContainerStarted","Data":"7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3"} Feb 24 03:39:03 crc kubenswrapper[4923]: I0224 03:39:03.398533 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4wrp9" podStartSLOduration=3.002577159 podStartE2EDuration="4.398513657s" podCreationTimestamp="2026-02-24 03:38:59 
+0000 UTC" firstStartedPulling="2026-02-24 03:39:01.339764392 +0000 UTC m=+2665.356835215" lastFinishedPulling="2026-02-24 03:39:02.73570086 +0000 UTC m=+2666.752771713" observedRunningTime="2026-02-24 03:39:03.393715841 +0000 UTC m=+2667.410786644" watchObservedRunningTime="2026-02-24 03:39:03.398513657 +0000 UTC m=+2667.415584470" Feb 24 03:39:03 crc kubenswrapper[4923]: I0224 03:39:03.425437 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:39:03 crc kubenswrapper[4923]: I0224 03:39:03.476570 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:39:05 crc kubenswrapper[4923]: I0224 03:39:05.648316 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:39:05 crc kubenswrapper[4923]: I0224 03:39:05.648678 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:39:05 crc kubenswrapper[4923]: I0224 03:39:05.683891 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95ln7"] Feb 24 03:39:05 crc kubenswrapper[4923]: I0224 03:39:05.684270 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-95ln7" podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerName="registry-server" containerID="cri-o://8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5" gracePeriod=2 Feb 24 03:39:05 crc kubenswrapper[4923]: I0224 03:39:05.742419 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.138338 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.237709 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-utilities\") pod \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.238158 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvsb8\" (UniqueName: \"kubernetes.io/projected/4833fd35-eb50-430c-b7bc-2e179d1bf41c-kube-api-access-rvsb8\") pod \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.238323 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-catalog-content\") pod \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\" (UID: \"4833fd35-eb50-430c-b7bc-2e179d1bf41c\") " Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.240381 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-utilities" (OuterVolumeSpecName: "utilities") pod "4833fd35-eb50-430c-b7bc-2e179d1bf41c" (UID: "4833fd35-eb50-430c-b7bc-2e179d1bf41c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.244226 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4833fd35-eb50-430c-b7bc-2e179d1bf41c-kube-api-access-rvsb8" (OuterVolumeSpecName: "kube-api-access-rvsb8") pod "4833fd35-eb50-430c-b7bc-2e179d1bf41c" (UID: "4833fd35-eb50-430c-b7bc-2e179d1bf41c"). InnerVolumeSpecName "kube-api-access-rvsb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.340976 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.341030 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvsb8\" (UniqueName: \"kubernetes.io/projected/4833fd35-eb50-430c-b7bc-2e179d1bf41c-kube-api-access-rvsb8\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.357481 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4833fd35-eb50-430c-b7bc-2e179d1bf41c" (UID: "4833fd35-eb50-430c-b7bc-2e179d1bf41c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.412200 4923 generic.go:334] "Generic (PLEG): container finished" podID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerID="8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5" exitCode=0 Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.412236 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95ln7" event={"ID":"4833fd35-eb50-430c-b7bc-2e179d1bf41c","Type":"ContainerDied","Data":"8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5"} Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.412265 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95ln7" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.412282 4923 scope.go:117] "RemoveContainer" containerID="8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.412270 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95ln7" event={"ID":"4833fd35-eb50-430c-b7bc-2e179d1bf41c","Type":"ContainerDied","Data":"36dc6f007ba3bf157745e289e9374817ca22bc692f01f99ee12f1e7c43041a56"} Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.437595 4923 scope.go:117] "RemoveContainer" containerID="1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.443842 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4833fd35-eb50-430c-b7bc-2e179d1bf41c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.448631 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95ln7"] Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.460281 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-95ln7"] Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.471205 4923 scope.go:117] "RemoveContainer" containerID="fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.471546 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.510737 4923 scope.go:117] "RemoveContainer" containerID="8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5" Feb 24 03:39:06 crc kubenswrapper[4923]: E0224 03:39:06.511127 4923 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5\": container with ID starting with 8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5 not found: ID does not exist" containerID="8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.511166 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5"} err="failed to get container status \"8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5\": rpc error: code = NotFound desc = could not find container \"8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5\": container with ID starting with 8683a95502153bb7d484f44d45b36e1b989e611d2250b3cf1169695f4966fad5 not found: ID does not exist" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.511195 4923 scope.go:117] "RemoveContainer" containerID="1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e" Feb 24 03:39:06 crc kubenswrapper[4923]: E0224 03:39:06.511616 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e\": container with ID starting with 1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e not found: ID does not exist" containerID="1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.511640 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e"} err="failed to get container status \"1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e\": rpc error: code = NotFound desc = could 
not find container \"1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e\": container with ID starting with 1c2b29fece31efbd9328c87a5c44a41c65eb4642224d719c276f19967a3fd29e not found: ID does not exist" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.511654 4923 scope.go:117] "RemoveContainer" containerID="fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a" Feb 24 03:39:06 crc kubenswrapper[4923]: E0224 03:39:06.511903 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a\": container with ID starting with fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a not found: ID does not exist" containerID="fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a" Feb 24 03:39:06 crc kubenswrapper[4923]: I0224 03:39:06.511934 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a"} err="failed to get container status \"fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a\": rpc error: code = NotFound desc = could not find container \"fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a\": container with ID starting with fa6c127389be0cb6ce696a9bcd0c7f7ca9033326f19f785caec6edca4234410a not found: ID does not exist" Feb 24 03:39:07 crc kubenswrapper[4923]: I0224 03:39:07.750985 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" path="/var/lib/kubelet/pods/4833fd35-eb50-430c-b7bc-2e179d1bf41c/volumes" Feb 24 03:39:08 crc kubenswrapper[4923]: I0224 03:39:08.270446 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsqgq"] Feb 24 03:39:08 crc kubenswrapper[4923]: I0224 03:39:08.429635 4923 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-wsqgq" podUID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerName="registry-server" containerID="cri-o://091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3" gracePeriod=2 Feb 24 03:39:08 crc kubenswrapper[4923]: E0224 03:39:08.673667 4923 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.194:44646->38.102.83.194:35219: read tcp 38.102.83.194:44646->38.102.83.194:35219: read: connection reset by peer Feb 24 03:39:08 crc kubenswrapper[4923]: I0224 03:39:08.922901 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:39:08 crc kubenswrapper[4923]: I0224 03:39:08.995985 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzm8j\" (UniqueName: \"kubernetes.io/projected/e2dd9af9-2a22-4a54-b370-05203a3e70d0-kube-api-access-rzm8j\") pod \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " Feb 24 03:39:08 crc kubenswrapper[4923]: I0224 03:39:08.996117 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-catalog-content\") pod \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " Feb 24 03:39:08 crc kubenswrapper[4923]: I0224 03:39:08.996212 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-utilities\") pod \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\" (UID: \"e2dd9af9-2a22-4a54-b370-05203a3e70d0\") " Feb 24 03:39:08 crc kubenswrapper[4923]: I0224 03:39:08.998056 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-utilities" 
(OuterVolumeSpecName: "utilities") pod "e2dd9af9-2a22-4a54-b370-05203a3e70d0" (UID: "e2dd9af9-2a22-4a54-b370-05203a3e70d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.004166 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2dd9af9-2a22-4a54-b370-05203a3e70d0-kube-api-access-rzm8j" (OuterVolumeSpecName: "kube-api-access-rzm8j") pod "e2dd9af9-2a22-4a54-b370-05203a3e70d0" (UID: "e2dd9af9-2a22-4a54-b370-05203a3e70d0"). InnerVolumeSpecName "kube-api-access-rzm8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.035754 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2dd9af9-2a22-4a54-b370-05203a3e70d0" (UID: "e2dd9af9-2a22-4a54-b370-05203a3e70d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.098040 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.098072 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2dd9af9-2a22-4a54-b370-05203a3e70d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.098082 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzm8j\" (UniqueName: \"kubernetes.io/projected/e2dd9af9-2a22-4a54-b370-05203a3e70d0-kube-api-access-rzm8j\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.444898 4923 generic.go:334] "Generic (PLEG): container finished" podID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerID="091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3" exitCode=0 Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.444985 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsqgq" event={"ID":"e2dd9af9-2a22-4a54-b370-05203a3e70d0","Type":"ContainerDied","Data":"091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3"} Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.445244 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wsqgq" event={"ID":"e2dd9af9-2a22-4a54-b370-05203a3e70d0","Type":"ContainerDied","Data":"64d0de9e1d77c22b9b2437fd006e0647e15eef4e9babe07ea2a02d233c8b4232"} Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.445049 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wsqgq" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.445271 4923 scope.go:117] "RemoveContainer" containerID="091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.486201 4923 scope.go:117] "RemoveContainer" containerID="29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.489723 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsqgq"] Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.498208 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wsqgq"] Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.522763 4923 scope.go:117] "RemoveContainer" containerID="0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.575763 4923 scope.go:117] "RemoveContainer" containerID="091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3" Feb 24 03:39:09 crc kubenswrapper[4923]: E0224 03:39:09.576285 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3\": container with ID starting with 091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3 not found: ID does not exist" containerID="091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.576354 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3"} err="failed to get container status \"091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3\": rpc error: code = NotFound desc = could not find container 
\"091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3\": container with ID starting with 091f10d003088cf278aa042f0bf4c2d66a63c14be8444a30db31ad0136d94ff3 not found: ID does not exist" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.576393 4923 scope.go:117] "RemoveContainer" containerID="29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378" Feb 24 03:39:09 crc kubenswrapper[4923]: E0224 03:39:09.576868 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378\": container with ID starting with 29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378 not found: ID does not exist" containerID="29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.576903 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378"} err="failed to get container status \"29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378\": rpc error: code = NotFound desc = could not find container \"29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378\": container with ID starting with 29f14940045f6790f0f09028e8c5d3f57bb6bb43e8ce8d52f1c256f3808d1378 not found: ID does not exist" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.576921 4923 scope.go:117] "RemoveContainer" containerID="0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75" Feb 24 03:39:09 crc kubenswrapper[4923]: E0224 03:39:09.577240 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75\": container with ID starting with 0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75 not found: ID does not exist" 
containerID="0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.577271 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75"} err="failed to get container status \"0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75\": rpc error: code = NotFound desc = could not find container \"0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75\": container with ID starting with 0b1b52d53d64be3a43be3d22b5e97c1f4fabede3edca02f0a3616bf20e9dda75 not found: ID does not exist" Feb 24 03:39:09 crc kubenswrapper[4923]: I0224 03:39:09.730903 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" path="/var/lib/kubelet/pods/e2dd9af9-2a22-4a54-b370-05203a3e70d0/volumes" Feb 24 03:39:10 crc kubenswrapper[4923]: I0224 03:39:10.001671 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:39:10 crc kubenswrapper[4923]: I0224 03:39:10.001736 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:39:10 crc kubenswrapper[4923]: I0224 03:39:10.070941 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:39:10 crc kubenswrapper[4923]: I0224 03:39:10.504443 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:39:11 crc kubenswrapper[4923]: I0224 03:39:11.713275 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:39:11 crc kubenswrapper[4923]: E0224 03:39:11.713844 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:39:12 crc kubenswrapper[4923]: I0224 03:39:12.275750 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4wrp9"] Feb 24 03:39:12 crc kubenswrapper[4923]: I0224 03:39:12.472097 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4wrp9" podUID="2419463d-6462-426c-b238-aaa487c884e2" containerName="registry-server" containerID="cri-o://7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3" gracePeriod=2 Feb 24 03:39:12 crc kubenswrapper[4923]: I0224 03:39:12.947548 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.075802 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-catalog-content\") pod \"2419463d-6462-426c-b238-aaa487c884e2\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.076373 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-utilities\") pod \"2419463d-6462-426c-b238-aaa487c884e2\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.076502 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmzjr\" (UniqueName: \"kubernetes.io/projected/2419463d-6462-426c-b238-aaa487c884e2-kube-api-access-mmzjr\") pod \"2419463d-6462-426c-b238-aaa487c884e2\" (UID: \"2419463d-6462-426c-b238-aaa487c884e2\") " Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.077288 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-utilities" (OuterVolumeSpecName: "utilities") pod "2419463d-6462-426c-b238-aaa487c884e2" (UID: "2419463d-6462-426c-b238-aaa487c884e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.086835 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2419463d-6462-426c-b238-aaa487c884e2-kube-api-access-mmzjr" (OuterVolumeSpecName: "kube-api-access-mmzjr") pod "2419463d-6462-426c-b238-aaa487c884e2" (UID: "2419463d-6462-426c-b238-aaa487c884e2"). InnerVolumeSpecName "kube-api-access-mmzjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.136037 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2419463d-6462-426c-b238-aaa487c884e2" (UID: "2419463d-6462-426c-b238-aaa487c884e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.178420 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.178648 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2419463d-6462-426c-b238-aaa487c884e2-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.178743 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmzjr\" (UniqueName: \"kubernetes.io/projected/2419463d-6462-426c-b238-aaa487c884e2-kube-api-access-mmzjr\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.486818 4923 generic.go:334] "Generic (PLEG): container finished" podID="2419463d-6462-426c-b238-aaa487c884e2" containerID="7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3" exitCode=0 Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.486880 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wrp9" event={"ID":"2419463d-6462-426c-b238-aaa487c884e2","Type":"ContainerDied","Data":"7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3"} Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.486932 4923 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-4wrp9" event={"ID":"2419463d-6462-426c-b238-aaa487c884e2","Type":"ContainerDied","Data":"67571331f34499da95a4290c3ef795ef74136599218dcf081e00bb193405ee87"} Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.486930 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wrp9" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.486952 4923 scope.go:117] "RemoveContainer" containerID="7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.512905 4923 scope.go:117] "RemoveContainer" containerID="6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.531956 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4wrp9"] Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.539787 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4wrp9"] Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.561872 4923 scope.go:117] "RemoveContainer" containerID="69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.616024 4923 scope.go:117] "RemoveContainer" containerID="7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3" Feb 24 03:39:13 crc kubenswrapper[4923]: E0224 03:39:13.616598 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3\": container with ID starting with 7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3 not found: ID does not exist" containerID="7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 
03:39:13.616649 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3"} err="failed to get container status \"7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3\": rpc error: code = NotFound desc = could not find container \"7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3\": container with ID starting with 7c4ab2e897df97a3785962e0537e58c7490deeac37123f149cfd241482a78ed3 not found: ID does not exist" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.616683 4923 scope.go:117] "RemoveContainer" containerID="6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0" Feb 24 03:39:13 crc kubenswrapper[4923]: E0224 03:39:13.617033 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0\": container with ID starting with 6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0 not found: ID does not exist" containerID="6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.617065 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0"} err="failed to get container status \"6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0\": rpc error: code = NotFound desc = could not find container \"6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0\": container with ID starting with 6efe769937990157199a17de2d38da49ffb9d1acc103cffcbeed7ac33fcfc5e0 not found: ID does not exist" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.617085 4923 scope.go:117] "RemoveContainer" containerID="69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7" Feb 24 03:39:13 crc 
kubenswrapper[4923]: E0224 03:39:13.617577 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7\": container with ID starting with 69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7 not found: ID does not exist" containerID="69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.617607 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7"} err="failed to get container status \"69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7\": rpc error: code = NotFound desc = could not find container \"69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7\": container with ID starting with 69db2ba2a2a4677c4a84069e6f4b8d48bd531545bc32e9b03584b9aaf2eae1b7 not found: ID does not exist" Feb 24 03:39:13 crc kubenswrapper[4923]: I0224 03:39:13.724446 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2419463d-6462-426c-b238-aaa487c884e2" path="/var/lib/kubelet/pods/2419463d-6462-426c-b238-aaa487c884e2/volumes" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.365360 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 24 03:39:20 crc kubenswrapper[4923]: E0224 03:39:20.366587 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerName="extract-utilities" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.366605 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerName="extract-utilities" Feb 24 03:39:20 crc kubenswrapper[4923]: E0224 03:39:20.366638 4923 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerName="extract-utilities" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.366646 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerName="extract-utilities" Feb 24 03:39:20 crc kubenswrapper[4923]: E0224 03:39:20.366662 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2419463d-6462-426c-b238-aaa487c884e2" containerName="extract-content" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.366672 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="2419463d-6462-426c-b238-aaa487c884e2" containerName="extract-content" Feb 24 03:39:20 crc kubenswrapper[4923]: E0224 03:39:20.366688 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerName="registry-server" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.366697 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerName="registry-server" Feb 24 03:39:20 crc kubenswrapper[4923]: E0224 03:39:20.366716 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerName="registry-server" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.366726 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerName="registry-server" Feb 24 03:39:20 crc kubenswrapper[4923]: E0224 03:39:20.366746 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2419463d-6462-426c-b238-aaa487c884e2" containerName="extract-utilities" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.366754 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="2419463d-6462-426c-b238-aaa487c884e2" containerName="extract-utilities" Feb 24 03:39:20 crc kubenswrapper[4923]: E0224 03:39:20.366778 4923 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2419463d-6462-426c-b238-aaa487c884e2" containerName="registry-server" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.366787 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="2419463d-6462-426c-b238-aaa487c884e2" containerName="registry-server" Feb 24 03:39:20 crc kubenswrapper[4923]: E0224 03:39:20.366802 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerName="extract-content" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.366811 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerName="extract-content" Feb 24 03:39:20 crc kubenswrapper[4923]: E0224 03:39:20.366826 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerName="extract-content" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.366834 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerName="extract-content" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.367042 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="2419463d-6462-426c-b238-aaa487c884e2" containerName="registry-server" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.367058 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2dd9af9-2a22-4a54-b370-05203a3e70d0" containerName="registry-server" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.367079 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4833fd35-eb50-430c-b7bc-2e179d1bf41c" containerName="registry-server" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.367817 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.371855 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.371885 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nvb7w" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.372026 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.372212 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.392232 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.525971 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.526037 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.526113 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-config-data\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.526255 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.526404 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.526499 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdpjk\" (UniqueName: \"kubernetes.io/projected/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-kube-api-access-wdpjk\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.526619 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.526693 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.526735 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.629255 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdpjk\" (UniqueName: \"kubernetes.io/projected/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-kube-api-access-wdpjk\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.629426 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.629533 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.629595 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.629655 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.629733 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.629801 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-config-data\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.629873 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.629924 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " 
pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.630118 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.630174 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.630936 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.631476 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-config-data\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.631881 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc 
kubenswrapper[4923]: I0224 03:39:20.636664 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.637256 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.643692 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.645166 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdpjk\" (UniqueName: \"kubernetes.io/projected/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-kube-api-access-wdpjk\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.662050 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " pod="openstack/tempest-tests-tempest" Feb 24 03:39:20 crc kubenswrapper[4923]: I0224 03:39:20.718269 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.026942 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mvpwx"] Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.029166 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.039240 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvpwx"] Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.142432 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nsf7\" (UniqueName: \"kubernetes.io/projected/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-kube-api-access-9nsf7\") pod \"certified-operators-mvpwx\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.142506 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-catalog-content\") pod \"certified-operators-mvpwx\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.142702 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-utilities\") pod \"certified-operators-mvpwx\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.164154 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/tempest-tests-tempest"] Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.244193 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-utilities\") pod \"certified-operators-mvpwx\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.244632 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-utilities\") pod \"certified-operators-mvpwx\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.244764 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nsf7\" (UniqueName: \"kubernetes.io/projected/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-kube-api-access-9nsf7\") pod \"certified-operators-mvpwx\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.244817 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-catalog-content\") pod \"certified-operators-mvpwx\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.245044 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-catalog-content\") pod \"certified-operators-mvpwx\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " pod="openshift-marketplace/certified-operators-mvpwx" 
Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.277082 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nsf7\" (UniqueName: \"kubernetes.io/projected/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-kube-api-access-9nsf7\") pod \"certified-operators-mvpwx\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.367106 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.579051 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761","Type":"ContainerStarted","Data":"06bafcb4f1ea20da89fefd4d5340464e40dc777a10f21a9bdfd370d99f3a5f3d"} Feb 24 03:39:21 crc kubenswrapper[4923]: I0224 03:39:21.903130 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvpwx"] Feb 24 03:39:22 crc kubenswrapper[4923]: I0224 03:39:22.590547 4923 generic.go:334] "Generic (PLEG): container finished" podID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerID="d0b6fbf30f59ff871614d628657c87b8a04d4e36d887d2f895b923266ff1fa60" exitCode=0 Feb 24 03:39:22 crc kubenswrapper[4923]: I0224 03:39:22.590788 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvpwx" event={"ID":"714dca7a-7964-4cb6-a146-19bbb1cc9fe4","Type":"ContainerDied","Data":"d0b6fbf30f59ff871614d628657c87b8a04d4e36d887d2f895b923266ff1fa60"} Feb 24 03:39:22 crc kubenswrapper[4923]: I0224 03:39:22.590966 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvpwx" event={"ID":"714dca7a-7964-4cb6-a146-19bbb1cc9fe4","Type":"ContainerStarted","Data":"ded06af895bede9c062adacb4ddb26f2eed4092907fe0b8907b862d55ef43033"} Feb 24 03:39:22 crc 
kubenswrapper[4923]: I0224 03:39:22.712795 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:39:22 crc kubenswrapper[4923]: E0224 03:39:22.713013 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:39:27 crc kubenswrapper[4923]: I0224 03:39:27.648431 4923 generic.go:334] "Generic (PLEG): container finished" podID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerID="c604075367e82132ec23d259ebc66ceefb355c17c5ad0457a9037eeb74d679b6" exitCode=0 Feb 24 03:39:27 crc kubenswrapper[4923]: I0224 03:39:27.648581 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvpwx" event={"ID":"714dca7a-7964-4cb6-a146-19bbb1cc9fe4","Type":"ContainerDied","Data":"c604075367e82132ec23d259ebc66ceefb355c17c5ad0457a9037eeb74d679b6"} Feb 24 03:39:28 crc kubenswrapper[4923]: I0224 03:39:28.662215 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvpwx" event={"ID":"714dca7a-7964-4cb6-a146-19bbb1cc9fe4","Type":"ContainerStarted","Data":"2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f"} Feb 24 03:39:28 crc kubenswrapper[4923]: I0224 03:39:28.691629 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mvpwx" podStartSLOduration=2.266345441 podStartE2EDuration="7.691611139s" podCreationTimestamp="2026-02-24 03:39:21 +0000 UTC" firstStartedPulling="2026-02-24 03:39:22.592919958 +0000 UTC m=+2686.609990771" lastFinishedPulling="2026-02-24 03:39:28.018185646 +0000 
UTC m=+2692.035256469" observedRunningTime="2026-02-24 03:39:28.685085209 +0000 UTC m=+2692.702156042" watchObservedRunningTime="2026-02-24 03:39:28.691611139 +0000 UTC m=+2692.708681952" Feb 24 03:39:31 crc kubenswrapper[4923]: I0224 03:39:31.367468 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:31 crc kubenswrapper[4923]: I0224 03:39:31.368110 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:31 crc kubenswrapper[4923]: I0224 03:39:31.414672 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:34 crc kubenswrapper[4923]: I0224 03:39:34.714417 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:39:34 crc kubenswrapper[4923]: E0224 03:39:34.715272 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:39:41 crc kubenswrapper[4923]: I0224 03:39:41.424367 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:41 crc kubenswrapper[4923]: I0224 03:39:41.473922 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvpwx"] Feb 24 03:39:41 crc kubenswrapper[4923]: I0224 03:39:41.801063 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mvpwx" 
podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerName="registry-server" containerID="cri-o://2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f" gracePeriod=2 Feb 24 03:39:42 crc kubenswrapper[4923]: I0224 03:39:42.811892 4923 generic.go:334] "Generic (PLEG): container finished" podID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerID="2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f" exitCode=0 Feb 24 03:39:42 crc kubenswrapper[4923]: I0224 03:39:42.811983 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvpwx" event={"ID":"714dca7a-7964-4cb6-a146-19bbb1cc9fe4","Type":"ContainerDied","Data":"2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f"} Feb 24 03:39:49 crc kubenswrapper[4923]: I0224 03:39:49.713156 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:39:49 crc kubenswrapper[4923]: E0224 03:39:49.714549 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:39:51 crc kubenswrapper[4923]: E0224 03:39:51.368826 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f is running failed: container process not found" containerID="2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 03:39:51 crc kubenswrapper[4923]: E0224 03:39:51.371194 4923 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f is running failed: container process not found" containerID="2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 03:39:51 crc kubenswrapper[4923]: E0224 03:39:51.371772 4923 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f is running failed: container process not found" containerID="2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 03:39:51 crc kubenswrapper[4923]: E0224 03:39:51.371841 4923 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-mvpwx" podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerName="registry-server" Feb 24 03:39:58 crc kubenswrapper[4923]: E0224 03:39:58.367759 4923 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 24 03:39:58 crc kubenswrapper[4923]: E0224 03:39:58.369586 4923 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wdpjk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(9b6f2b0b-f8d2-4a36-a1a2-177dcf809761): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 03:39:58 crc kubenswrapper[4923]: E0224 03:39:58.370839 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.719345 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.912418 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nsf7\" (UniqueName: \"kubernetes.io/projected/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-kube-api-access-9nsf7\") pod \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.912563 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-utilities\") pod \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.912653 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-catalog-content\") pod \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\" (UID: \"714dca7a-7964-4cb6-a146-19bbb1cc9fe4\") " Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.913359 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-utilities" (OuterVolumeSpecName: "utilities") pod "714dca7a-7964-4cb6-a146-19bbb1cc9fe4" (UID: "714dca7a-7964-4cb6-a146-19bbb1cc9fe4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.925463 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-kube-api-access-9nsf7" (OuterVolumeSpecName: "kube-api-access-9nsf7") pod "714dca7a-7964-4cb6-a146-19bbb1cc9fe4" (UID: "714dca7a-7964-4cb6-a146-19bbb1cc9fe4"). InnerVolumeSpecName "kube-api-access-9nsf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.960114 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "714dca7a-7964-4cb6-a146-19bbb1cc9fe4" (UID: "714dca7a-7964-4cb6-a146-19bbb1cc9fe4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.964550 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mvpwx" Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.964575 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvpwx" event={"ID":"714dca7a-7964-4cb6-a146-19bbb1cc9fe4","Type":"ContainerDied","Data":"ded06af895bede9c062adacb4ddb26f2eed4092907fe0b8907b862d55ef43033"} Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.964652 4923 scope.go:117] "RemoveContainer" containerID="2f9699a904c09575dede7f52207734d3f81c7880efb48372501439d8b9fb2e2f" Feb 24 03:39:58 crc kubenswrapper[4923]: E0224 03:39:58.967062 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" Feb 24 03:39:58 crc kubenswrapper[4923]: I0224 03:39:58.991155 4923 scope.go:117] "RemoveContainer" containerID="c604075367e82132ec23d259ebc66ceefb355c17c5ad0457a9037eeb74d679b6" Feb 24 03:39:59 crc kubenswrapper[4923]: I0224 03:39:59.015027 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:59 crc kubenswrapper[4923]: I0224 03:39:59.015058 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nsf7\" (UniqueName: \"kubernetes.io/projected/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-kube-api-access-9nsf7\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:59 crc kubenswrapper[4923]: I0224 03:39:59.015098 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dca7a-7964-4cb6-a146-19bbb1cc9fe4-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:39:59 crc kubenswrapper[4923]: I0224 03:39:59.017120 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvpwx"] Feb 24 03:39:59 crc kubenswrapper[4923]: I0224 03:39:59.017343 4923 scope.go:117] "RemoveContainer" containerID="d0b6fbf30f59ff871614d628657c87b8a04d4e36d887d2f895b923266ff1fa60" Feb 24 03:39:59 crc kubenswrapper[4923]: I0224 03:39:59.031224 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mvpwx"] Feb 24 03:39:59 crc kubenswrapper[4923]: I0224 03:39:59.727710 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" path="/var/lib/kubelet/pods/714dca7a-7964-4cb6-a146-19bbb1cc9fe4/volumes" Feb 24 03:40:02 crc kubenswrapper[4923]: I0224 03:40:02.713164 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:40:02 crc kubenswrapper[4923]: E0224 03:40:02.713654 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:40:12 crc kubenswrapper[4923]: I0224 03:40:12.148719 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 24 03:40:13 crc kubenswrapper[4923]: I0224 03:40:13.713148 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:40:13 crc kubenswrapper[4923]: E0224 03:40:13.714036 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:40:14 crc kubenswrapper[4923]: I0224 03:40:14.130758 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761","Type":"ContainerStarted","Data":"bf0a98d09807405cb7c48564281efb8750b3c036c955fab8ced2931f9c0d695b"} Feb 24 03:40:14 crc kubenswrapper[4923]: I0224 03:40:14.149860 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.187888876 podStartE2EDuration="55.149841104s" podCreationTimestamp="2026-02-24 03:39:19 +0000 UTC" firstStartedPulling="2026-02-24 03:39:21.183822467 +0000 UTC m=+2685.200893300" lastFinishedPulling="2026-02-24 03:40:12.145774715 +0000 UTC m=+2736.162845528" observedRunningTime="2026-02-24 03:40:14.14433571 +0000 UTC m=+2738.161406563" watchObservedRunningTime="2026-02-24 03:40:14.149841104 +0000 UTC m=+2738.166911917" Feb 24 03:40:26 crc kubenswrapper[4923]: I0224 03:40:26.713146 4923 scope.go:117] "RemoveContainer" 
containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:40:26 crc kubenswrapper[4923]: E0224 03:40:26.713907 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:40:38 crc kubenswrapper[4923]: I0224 03:40:38.713204 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:40:38 crc kubenswrapper[4923]: E0224 03:40:38.714062 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:40:53 crc kubenswrapper[4923]: I0224 03:40:53.714282 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:40:53 crc kubenswrapper[4923]: E0224 03:40:53.715076 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:41:07 crc kubenswrapper[4923]: I0224 03:41:07.718627 4923 scope.go:117] 
"RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:41:07 crc kubenswrapper[4923]: E0224 03:41:07.719610 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:41:21 crc kubenswrapper[4923]: I0224 03:41:21.712979 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:41:22 crc kubenswrapper[4923]: I0224 03:41:22.808684 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"a7d5408f34c531fa819a01d81bc87f0133ac2b5e1f29fa79b4fc1efb026de283"} Feb 24 03:41:30 crc kubenswrapper[4923]: I0224 03:41:30.396767 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-645cdc8bdf-bkt49" podUID="28a2632f-7155-4c9e-9767-fcda3ff0688b" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 24 03:43:49 crc kubenswrapper[4923]: I0224 03:43:49.916057 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:43:49 crc kubenswrapper[4923]: I0224 03:43:49.916756 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" 
podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:44:19 crc kubenswrapper[4923]: I0224 03:44:19.916731 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:44:19 crc kubenswrapper[4923]: I0224 03:44:19.917531 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:44:49 crc kubenswrapper[4923]: I0224 03:44:49.916740 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:44:49 crc kubenswrapper[4923]: I0224 03:44:49.917379 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:44:49 crc kubenswrapper[4923]: I0224 03:44:49.917426 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 03:44:49 crc kubenswrapper[4923]: I0224 03:44:49.918173 4923 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a7d5408f34c531fa819a01d81bc87f0133ac2b5e1f29fa79b4fc1efb026de283"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 03:44:49 crc kubenswrapper[4923]: I0224 03:44:49.918216 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://a7d5408f34c531fa819a01d81bc87f0133ac2b5e1f29fa79b4fc1efb026de283" gracePeriod=600 Feb 24 03:44:50 crc kubenswrapper[4923]: I0224 03:44:50.584609 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="a7d5408f34c531fa819a01d81bc87f0133ac2b5e1f29fa79b4fc1efb026de283" exitCode=0 Feb 24 03:44:50 crc kubenswrapper[4923]: I0224 03:44:50.585214 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"a7d5408f34c531fa819a01d81bc87f0133ac2b5e1f29fa79b4fc1efb026de283"} Feb 24 03:44:50 crc kubenswrapper[4923]: I0224 03:44:50.585256 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5"} Feb 24 03:44:50 crc kubenswrapper[4923]: I0224 03:44:50.585282 4923 scope.go:117] "RemoveContainer" containerID="2f8c132d71da57170f0964e23b367e1a421e86fe11c822c9ae5e6b77f266e656" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.174825 4923 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj"] Feb 24 03:45:00 crc kubenswrapper[4923]: E0224 03:45:00.175856 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerName="extract-content" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.175875 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerName="extract-content" Feb 24 03:45:00 crc kubenswrapper[4923]: E0224 03:45:00.175897 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerName="registry-server" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.175905 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerName="registry-server" Feb 24 03:45:00 crc kubenswrapper[4923]: E0224 03:45:00.175925 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerName="extract-utilities" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.175933 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerName="extract-utilities" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.176163 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="714dca7a-7964-4cb6-a146-19bbb1cc9fe4" containerName="registry-server" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.176846 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.179805 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.180190 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.192793 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj"] Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.208920 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8644fc11-83a6-4c78-98c9-939ad4e980ba-secret-volume\") pod \"collect-profiles-29531745-x7vjj\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.209009 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx545\" (UniqueName: \"kubernetes.io/projected/8644fc11-83a6-4c78-98c9-939ad4e980ba-kube-api-access-fx545\") pod \"collect-profiles-29531745-x7vjj\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.209269 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8644fc11-83a6-4c78-98c9-939ad4e980ba-config-volume\") pod \"collect-profiles-29531745-x7vjj\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.311203 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8644fc11-83a6-4c78-98c9-939ad4e980ba-config-volume\") pod \"collect-profiles-29531745-x7vjj\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.311442 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8644fc11-83a6-4c78-98c9-939ad4e980ba-secret-volume\") pod \"collect-profiles-29531745-x7vjj\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.311470 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx545\" (UniqueName: \"kubernetes.io/projected/8644fc11-83a6-4c78-98c9-939ad4e980ba-kube-api-access-fx545\") pod \"collect-profiles-29531745-x7vjj\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.312700 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8644fc11-83a6-4c78-98c9-939ad4e980ba-config-volume\") pod \"collect-profiles-29531745-x7vjj\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.321235 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8644fc11-83a6-4c78-98c9-939ad4e980ba-secret-volume\") pod \"collect-profiles-29531745-x7vjj\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.346655 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx545\" (UniqueName: \"kubernetes.io/projected/8644fc11-83a6-4c78-98c9-939ad4e980ba-kube-api-access-fx545\") pod \"collect-profiles-29531745-x7vjj\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.510158 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:00 crc kubenswrapper[4923]: I0224 03:45:00.946394 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj"] Feb 24 03:45:01 crc kubenswrapper[4923]: I0224 03:45:01.721654 4923 generic.go:334] "Generic (PLEG): container finished" podID="8644fc11-83a6-4c78-98c9-939ad4e980ba" containerID="f14783520b5abdb71850a55e3a49532587d7f3b1f8c4ac921203df22ae9fa9f3" exitCode=0 Feb 24 03:45:01 crc kubenswrapper[4923]: I0224 03:45:01.723664 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" event={"ID":"8644fc11-83a6-4c78-98c9-939ad4e980ba","Type":"ContainerDied","Data":"f14783520b5abdb71850a55e3a49532587d7f3b1f8c4ac921203df22ae9fa9f3"} Feb 24 03:45:01 crc kubenswrapper[4923]: I0224 03:45:01.723726 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" 
event={"ID":"8644fc11-83a6-4c78-98c9-939ad4e980ba","Type":"ContainerStarted","Data":"06bd5b26fa612b6ca120cb1955b193afd3220188008e011fc0ed9085d9ec11f3"} Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.024860 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.168253 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8644fc11-83a6-4c78-98c9-939ad4e980ba-config-volume\") pod \"8644fc11-83a6-4c78-98c9-939ad4e980ba\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.168506 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8644fc11-83a6-4c78-98c9-939ad4e980ba-secret-volume\") pod \"8644fc11-83a6-4c78-98c9-939ad4e980ba\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.168544 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx545\" (UniqueName: \"kubernetes.io/projected/8644fc11-83a6-4c78-98c9-939ad4e980ba-kube-api-access-fx545\") pod \"8644fc11-83a6-4c78-98c9-939ad4e980ba\" (UID: \"8644fc11-83a6-4c78-98c9-939ad4e980ba\") " Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.169167 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8644fc11-83a6-4c78-98c9-939ad4e980ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "8644fc11-83a6-4c78-98c9-939ad4e980ba" (UID: "8644fc11-83a6-4c78-98c9-939ad4e980ba"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.174859 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8644fc11-83a6-4c78-98c9-939ad4e980ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8644fc11-83a6-4c78-98c9-939ad4e980ba" (UID: "8644fc11-83a6-4c78-98c9-939ad4e980ba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.175209 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8644fc11-83a6-4c78-98c9-939ad4e980ba-kube-api-access-fx545" (OuterVolumeSpecName: "kube-api-access-fx545") pod "8644fc11-83a6-4c78-98c9-939ad4e980ba" (UID: "8644fc11-83a6-4c78-98c9-939ad4e980ba"). InnerVolumeSpecName "kube-api-access-fx545". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.270856 4923 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8644fc11-83a6-4c78-98c9-939ad4e980ba-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.270892 4923 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8644fc11-83a6-4c78-98c9-939ad4e980ba-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.270901 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx545\" (UniqueName: \"kubernetes.io/projected/8644fc11-83a6-4c78-98c9-939ad4e980ba-kube-api-access-fx545\") on node \"crc\" DevicePath \"\"" Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.738756 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" 
event={"ID":"8644fc11-83a6-4c78-98c9-939ad4e980ba","Type":"ContainerDied","Data":"06bd5b26fa612b6ca120cb1955b193afd3220188008e011fc0ed9085d9ec11f3"} Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.739022 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06bd5b26fa612b6ca120cb1955b193afd3220188008e011fc0ed9085d9ec11f3" Feb 24 03:45:03 crc kubenswrapper[4923]: I0224 03:45:03.738788 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531745-x7vjj" Feb 24 03:45:04 crc kubenswrapper[4923]: I0224 03:45:04.096627 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"] Feb 24 03:45:04 crc kubenswrapper[4923]: I0224 03:45:04.104237 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531700-kgcnv"] Feb 24 03:45:05 crc kubenswrapper[4923]: I0224 03:45:05.726002 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10f5924-e560-4a3f-bb67-e7e59ab5fd75" path="/var/lib/kubelet/pods/d10f5924-e560-4a3f-bb67-e7e59ab5fd75/volumes" Feb 24 03:45:58 crc kubenswrapper[4923]: I0224 03:45:58.480759 4923 scope.go:117] "RemoveContainer" containerID="cb82784025bfa1cf0935dd6150ff02544c3422f10b29939953a0c256cd3cd884" Feb 24 03:47:19 crc kubenswrapper[4923]: I0224 03:47:19.916888 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:47:19 crc kubenswrapper[4923]: I0224 03:47:19.917500 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:47:49 crc kubenswrapper[4923]: I0224 03:47:49.916194 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:47:49 crc kubenswrapper[4923]: I0224 03:47:49.916804 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:48:19 crc kubenswrapper[4923]: I0224 03:48:19.916861 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:48:19 crc kubenswrapper[4923]: I0224 03:48:19.917490 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:48:19 crc kubenswrapper[4923]: I0224 03:48:19.917533 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 03:48:19 crc kubenswrapper[4923]: I0224 03:48:19.918340 4923 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 03:48:19 crc kubenswrapper[4923]: I0224 03:48:19.918410 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" gracePeriod=600 Feb 24 03:48:20 crc kubenswrapper[4923]: E0224 03:48:20.059260 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:48:20 crc kubenswrapper[4923]: I0224 03:48:20.657274 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" exitCode=0 Feb 24 03:48:20 crc kubenswrapper[4923]: I0224 03:48:20.657338 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5"} Feb 24 03:48:20 crc kubenswrapper[4923]: I0224 03:48:20.657409 4923 scope.go:117] "RemoveContainer" containerID="a7d5408f34c531fa819a01d81bc87f0133ac2b5e1f29fa79b4fc1efb026de283" Feb 24 03:48:20 crc 
kubenswrapper[4923]: I0224 03:48:20.658025 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:48:20 crc kubenswrapper[4923]: E0224 03:48:20.658292 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:48:31 crc kubenswrapper[4923]: I0224 03:48:31.713951 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:48:31 crc kubenswrapper[4923]: E0224 03:48:31.715003 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:48:42 crc kubenswrapper[4923]: I0224 03:48:42.714166 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:48:42 crc kubenswrapper[4923]: E0224 03:48:42.715181 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 
24 03:48:53 crc kubenswrapper[4923]: I0224 03:48:53.714527 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:48:53 crc kubenswrapper[4923]: E0224 03:48:53.715479 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:49:08 crc kubenswrapper[4923]: I0224 03:49:08.713666 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:49:08 crc kubenswrapper[4923]: E0224 03:49:08.714654 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:49:22 crc kubenswrapper[4923]: I0224 03:49:22.713082 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:49:22 crc kubenswrapper[4923]: E0224 03:49:22.713880 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" 
podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:49:34 crc kubenswrapper[4923]: I0224 03:49:34.713623 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:49:34 crc kubenswrapper[4923]: E0224 03:49:34.714663 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.423636 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d6pr5"] Feb 24 03:49:44 crc kubenswrapper[4923]: E0224 03:49:44.424813 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8644fc11-83a6-4c78-98c9-939ad4e980ba" containerName="collect-profiles" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.424832 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="8644fc11-83a6-4c78-98c9-939ad4e980ba" containerName="collect-profiles" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.425050 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="8644fc11-83a6-4c78-98c9-939ad4e980ba" containerName="collect-profiles" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.427267 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.443012 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6pr5"] Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.564652 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wmmb\" (UniqueName: \"kubernetes.io/projected/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-kube-api-access-4wmmb\") pod \"redhat-operators-d6pr5\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.565051 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-catalog-content\") pod \"redhat-operators-d6pr5\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.565269 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-utilities\") pod \"redhat-operators-d6pr5\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.667469 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-utilities\") pod \"redhat-operators-d6pr5\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.667823 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4wmmb\" (UniqueName: \"kubernetes.io/projected/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-kube-api-access-4wmmb\") pod \"redhat-operators-d6pr5\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.668019 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-utilities\") pod \"redhat-operators-d6pr5\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.668143 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-catalog-content\") pod \"redhat-operators-d6pr5\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.668454 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-catalog-content\") pod \"redhat-operators-d6pr5\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.688249 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wmmb\" (UniqueName: \"kubernetes.io/projected/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-kube-api-access-4wmmb\") pod \"redhat-operators-d6pr5\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:44 crc kubenswrapper[4923]: I0224 03:49:44.760322 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:45 crc kubenswrapper[4923]: I0224 03:49:45.217852 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6pr5"] Feb 24 03:49:45 crc kubenswrapper[4923]: I0224 03:49:45.523061 4923 generic.go:334] "Generic (PLEG): container finished" podID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerID="bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531" exitCode=0 Feb 24 03:49:45 crc kubenswrapper[4923]: I0224 03:49:45.523183 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6pr5" event={"ID":"c0fa5d99-86aa-4fbd-88b2-8d81def820f0","Type":"ContainerDied","Data":"bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531"} Feb 24 03:49:45 crc kubenswrapper[4923]: I0224 03:49:45.523407 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6pr5" event={"ID":"c0fa5d99-86aa-4fbd-88b2-8d81def820f0","Type":"ContainerStarted","Data":"c0042feb77c8affa9e610afc2cdb1d1db3dff82288dd4b4ba1f254b6478e77c5"} Feb 24 03:49:45 crc kubenswrapper[4923]: I0224 03:49:45.525063 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 03:49:46 crc kubenswrapper[4923]: I0224 03:49:46.545782 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6pr5" event={"ID":"c0fa5d99-86aa-4fbd-88b2-8d81def820f0","Type":"ContainerStarted","Data":"c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d"} Feb 24 03:49:47 crc kubenswrapper[4923]: I0224 03:49:47.717859 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:49:47 crc kubenswrapper[4923]: E0224 03:49:47.718436 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:49:48 crc kubenswrapper[4923]: I0224 03:49:48.565250 4923 generic.go:334] "Generic (PLEG): container finished" podID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerID="c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d" exitCode=0 Feb 24 03:49:48 crc kubenswrapper[4923]: I0224 03:49:48.565290 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6pr5" event={"ID":"c0fa5d99-86aa-4fbd-88b2-8d81def820f0","Type":"ContainerDied","Data":"c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d"} Feb 24 03:49:49 crc kubenswrapper[4923]: I0224 03:49:49.577173 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6pr5" event={"ID":"c0fa5d99-86aa-4fbd-88b2-8d81def820f0","Type":"ContainerStarted","Data":"6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6"} Feb 24 03:49:49 crc kubenswrapper[4923]: I0224 03:49:49.598606 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d6pr5" podStartSLOduration=2.129832588 podStartE2EDuration="5.598583868s" podCreationTimestamp="2026-02-24 03:49:44 +0000 UTC" firstStartedPulling="2026-02-24 03:49:45.524624212 +0000 UTC m=+3309.541695065" lastFinishedPulling="2026-02-24 03:49:48.993375532 +0000 UTC m=+3313.010446345" observedRunningTime="2026-02-24 03:49:49.594280865 +0000 UTC m=+3313.611351678" watchObservedRunningTime="2026-02-24 03:49:49.598583868 +0000 UTC m=+3313.615654681" Feb 24 03:49:54 crc kubenswrapper[4923]: I0224 03:49:54.761477 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:54 crc kubenswrapper[4923]: I0224 03:49:54.762425 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:49:55 crc kubenswrapper[4923]: I0224 03:49:55.805354 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d6pr5" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerName="registry-server" probeResult="failure" output=< Feb 24 03:49:55 crc kubenswrapper[4923]: timeout: failed to connect service ":50051" within 1s Feb 24 03:49:55 crc kubenswrapper[4923]: > Feb 24 03:49:59 crc kubenswrapper[4923]: I0224 03:49:59.713446 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:49:59 crc kubenswrapper[4923]: E0224 03:49:59.714205 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:50:04 crc kubenswrapper[4923]: I0224 03:50:04.829864 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:50:04 crc kubenswrapper[4923]: I0224 03:50:04.887180 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:50:05 crc kubenswrapper[4923]: I0224 03:50:05.066519 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6pr5"] Feb 24 03:50:06 crc kubenswrapper[4923]: I0224 03:50:06.742631 4923 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-d6pr5" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerName="registry-server" containerID="cri-o://6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6" gracePeriod=2 Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.326565 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.424398 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-catalog-content\") pod \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.424851 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wmmb\" (UniqueName: \"kubernetes.io/projected/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-kube-api-access-4wmmb\") pod \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.424978 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-utilities\") pod \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\" (UID: \"c0fa5d99-86aa-4fbd-88b2-8d81def820f0\") " Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.425963 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-utilities" (OuterVolumeSpecName: "utilities") pod "c0fa5d99-86aa-4fbd-88b2-8d81def820f0" (UID: "c0fa5d99-86aa-4fbd-88b2-8d81def820f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.434670 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-kube-api-access-4wmmb" (OuterVolumeSpecName: "kube-api-access-4wmmb") pod "c0fa5d99-86aa-4fbd-88b2-8d81def820f0" (UID: "c0fa5d99-86aa-4fbd-88b2-8d81def820f0"). InnerVolumeSpecName "kube-api-access-4wmmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.527980 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wmmb\" (UniqueName: \"kubernetes.io/projected/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-kube-api-access-4wmmb\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.528012 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.552474 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0fa5d99-86aa-4fbd-88b2-8d81def820f0" (UID: "c0fa5d99-86aa-4fbd-88b2-8d81def820f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.630255 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0fa5d99-86aa-4fbd-88b2-8d81def820f0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.750428 4923 generic.go:334] "Generic (PLEG): container finished" podID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerID="6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6" exitCode=0 Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.750483 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6pr5" event={"ID":"c0fa5d99-86aa-4fbd-88b2-8d81def820f0","Type":"ContainerDied","Data":"6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6"} Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.750514 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6pr5" event={"ID":"c0fa5d99-86aa-4fbd-88b2-8d81def820f0","Type":"ContainerDied","Data":"c0042feb77c8affa9e610afc2cdb1d1db3dff82288dd4b4ba1f254b6478e77c5"} Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.750538 4923 scope.go:117] "RemoveContainer" containerID="6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.750706 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d6pr5" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.787446 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6pr5"] Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.798658 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d6pr5"] Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.803394 4923 scope.go:117] "RemoveContainer" containerID="c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.837955 4923 scope.go:117] "RemoveContainer" containerID="bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.879647 4923 scope.go:117] "RemoveContainer" containerID="6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6" Feb 24 03:50:07 crc kubenswrapper[4923]: E0224 03:50:07.880105 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6\": container with ID starting with 6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6 not found: ID does not exist" containerID="6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.880145 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6"} err="failed to get container status \"6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6\": rpc error: code = NotFound desc = could not find container \"6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6\": container with ID starting with 6aad34337e90b6258998178dd30aa43807ee8a58e2202db3133b0874969f7fc6 not found: ID does 
not exist" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.880174 4923 scope.go:117] "RemoveContainer" containerID="c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d" Feb 24 03:50:07 crc kubenswrapper[4923]: E0224 03:50:07.885233 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d\": container with ID starting with c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d not found: ID does not exist" containerID="c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.885293 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d"} err="failed to get container status \"c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d\": rpc error: code = NotFound desc = could not find container \"c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d\": container with ID starting with c8e57bb6f8e5a7b0ada597b67d4d582fa5903609e86e342940554a401763e00d not found: ID does not exist" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.885349 4923 scope.go:117] "RemoveContainer" containerID="bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531" Feb 24 03:50:07 crc kubenswrapper[4923]: E0224 03:50:07.885831 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531\": container with ID starting with bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531 not found: ID does not exist" containerID="bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531" Feb 24 03:50:07 crc kubenswrapper[4923]: I0224 03:50:07.885903 4923 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531"} err="failed to get container status \"bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531\": rpc error: code = NotFound desc = could not find container \"bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531\": container with ID starting with bb90ecae55a5c05da7203e88c60594a736324eb4a4c0e654f052b38a5c0d8531 not found: ID does not exist" Feb 24 03:50:09 crc kubenswrapper[4923]: I0224 03:50:09.724351 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" path="/var/lib/kubelet/pods/c0fa5d99-86aa-4fbd-88b2-8d81def820f0/volumes" Feb 24 03:50:12 crc kubenswrapper[4923]: I0224 03:50:12.713762 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:50:12 crc kubenswrapper[4923]: E0224 03:50:12.714262 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:50:13 crc kubenswrapper[4923]: I0224 03:50:13.920608 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l52zx"] Feb 24 03:50:13 crc kubenswrapper[4923]: E0224 03:50:13.921195 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerName="extract-utilities" Feb 24 03:50:13 crc kubenswrapper[4923]: I0224 03:50:13.921207 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerName="extract-utilities" Feb 24 
03:50:13 crc kubenswrapper[4923]: E0224 03:50:13.921224 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerName="registry-server" Feb 24 03:50:13 crc kubenswrapper[4923]: I0224 03:50:13.921231 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerName="registry-server" Feb 24 03:50:13 crc kubenswrapper[4923]: E0224 03:50:13.921258 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerName="extract-content" Feb 24 03:50:13 crc kubenswrapper[4923]: I0224 03:50:13.921264 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerName="extract-content" Feb 24 03:50:13 crc kubenswrapper[4923]: I0224 03:50:13.921451 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0fa5d99-86aa-4fbd-88b2-8d81def820f0" containerName="registry-server" Feb 24 03:50:13 crc kubenswrapper[4923]: I0224 03:50:13.922663 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:13 crc kubenswrapper[4923]: I0224 03:50:13.939186 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l52zx"] Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.056949 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-catalog-content\") pod \"redhat-marketplace-l52zx\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.057037 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqpkv\" (UniqueName: \"kubernetes.io/projected/863c713d-6d19-468a-b762-e7b847400478-kube-api-access-wqpkv\") pod \"redhat-marketplace-l52zx\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.057165 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-utilities\") pod \"redhat-marketplace-l52zx\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.158844 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-catalog-content\") pod \"redhat-marketplace-l52zx\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.158918 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wqpkv\" (UniqueName: \"kubernetes.io/projected/863c713d-6d19-468a-b762-e7b847400478-kube-api-access-wqpkv\") pod \"redhat-marketplace-l52zx\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.158996 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-utilities\") pod \"redhat-marketplace-l52zx\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.159544 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-utilities\") pod \"redhat-marketplace-l52zx\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.159838 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-catalog-content\") pod \"redhat-marketplace-l52zx\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.179166 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqpkv\" (UniqueName: \"kubernetes.io/projected/863c713d-6d19-468a-b762-e7b847400478-kube-api-access-wqpkv\") pod \"redhat-marketplace-l52zx\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.243367 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.679145 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l52zx"] Feb 24 03:50:14 crc kubenswrapper[4923]: I0224 03:50:14.811203 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l52zx" event={"ID":"863c713d-6d19-468a-b762-e7b847400478","Type":"ContainerStarted","Data":"cc794c3c0c78b6cb29a727fe0dc6f9fe1812f49122c943233578af194a69e1dc"} Feb 24 03:50:15 crc kubenswrapper[4923]: I0224 03:50:15.820942 4923 generic.go:334] "Generic (PLEG): container finished" podID="863c713d-6d19-468a-b762-e7b847400478" containerID="ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485" exitCode=0 Feb 24 03:50:15 crc kubenswrapper[4923]: I0224 03:50:15.821009 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l52zx" event={"ID":"863c713d-6d19-468a-b762-e7b847400478","Type":"ContainerDied","Data":"ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485"} Feb 24 03:50:16 crc kubenswrapper[4923]: I0224 03:50:16.829644 4923 generic.go:334] "Generic (PLEG): container finished" podID="863c713d-6d19-468a-b762-e7b847400478" containerID="9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380" exitCode=0 Feb 24 03:50:16 crc kubenswrapper[4923]: I0224 03:50:16.829699 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l52zx" event={"ID":"863c713d-6d19-468a-b762-e7b847400478","Type":"ContainerDied","Data":"9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380"} Feb 24 03:50:17 crc kubenswrapper[4923]: I0224 03:50:17.839223 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l52zx" 
event={"ID":"863c713d-6d19-468a-b762-e7b847400478","Type":"ContainerStarted","Data":"837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661"} Feb 24 03:50:17 crc kubenswrapper[4923]: I0224 03:50:17.860692 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l52zx" podStartSLOduration=3.475113483 podStartE2EDuration="4.8606724s" podCreationTimestamp="2026-02-24 03:50:13 +0000 UTC" firstStartedPulling="2026-02-24 03:50:15.82360742 +0000 UTC m=+3339.840678233" lastFinishedPulling="2026-02-24 03:50:17.209166327 +0000 UTC m=+3341.226237150" observedRunningTime="2026-02-24 03:50:17.856718566 +0000 UTC m=+3341.873789389" watchObservedRunningTime="2026-02-24 03:50:17.8606724 +0000 UTC m=+3341.877743213" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.315931 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6qvld"] Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.318569 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.349262 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qvld"] Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.466434 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwt9c\" (UniqueName: \"kubernetes.io/projected/24724fb8-9a44-4400-90c9-7e8999165fb3-kube-api-access-nwt9c\") pod \"certified-operators-6qvld\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.466569 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-utilities\") pod \"certified-operators-6qvld\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.466638 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-catalog-content\") pod \"certified-operators-6qvld\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.568268 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-utilities\") pod \"certified-operators-6qvld\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.568712 4923 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-catalog-content\") pod \"certified-operators-6qvld\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.568825 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-utilities\") pod \"certified-operators-6qvld\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.568882 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwt9c\" (UniqueName: \"kubernetes.io/projected/24724fb8-9a44-4400-90c9-7e8999165fb3-kube-api-access-nwt9c\") pod \"certified-operators-6qvld\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.569155 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-catalog-content\") pod \"certified-operators-6qvld\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.591331 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwt9c\" (UniqueName: \"kubernetes.io/projected/24724fb8-9a44-4400-90c9-7e8999165fb3-kube-api-access-nwt9c\") pod \"certified-operators-6qvld\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:20 crc kubenswrapper[4923]: I0224 03:50:20.644797 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:21 crc kubenswrapper[4923]: I0224 03:50:21.168748 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qvld"] Feb 24 03:50:21 crc kubenswrapper[4923]: W0224 03:50:21.188816 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24724fb8_9a44_4400_90c9_7e8999165fb3.slice/crio-e18730afe4892ecbdb59a11379aa38cb87b693eae1bbe8397a699c9d61c645a9 WatchSource:0}: Error finding container e18730afe4892ecbdb59a11379aa38cb87b693eae1bbe8397a699c9d61c645a9: Status 404 returned error can't find the container with id e18730afe4892ecbdb59a11379aa38cb87b693eae1bbe8397a699c9d61c645a9 Feb 24 03:50:21 crc kubenswrapper[4923]: I0224 03:50:21.879904 4923 generic.go:334] "Generic (PLEG): container finished" podID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerID="cb6c9706d9139e8f0d293ff73b10df2e6835429d523446eef06d1d6a1e4b7181" exitCode=0 Feb 24 03:50:21 crc kubenswrapper[4923]: I0224 03:50:21.880003 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qvld" event={"ID":"24724fb8-9a44-4400-90c9-7e8999165fb3","Type":"ContainerDied","Data":"cb6c9706d9139e8f0d293ff73b10df2e6835429d523446eef06d1d6a1e4b7181"} Feb 24 03:50:21 crc kubenswrapper[4923]: I0224 03:50:21.880342 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qvld" event={"ID":"24724fb8-9a44-4400-90c9-7e8999165fb3","Type":"ContainerStarted","Data":"e18730afe4892ecbdb59a11379aa38cb87b693eae1bbe8397a699c9d61c645a9"} Feb 24 03:50:23 crc kubenswrapper[4923]: I0224 03:50:23.907606 4923 generic.go:334] "Generic (PLEG): container finished" podID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerID="0e43d42f14014c4a31070a1b89e194077686225407f1421dd8d1f22173ce2b4f" exitCode=0 Feb 24 03:50:23 crc kubenswrapper[4923]: I0224 
03:50:23.907714 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qvld" event={"ID":"24724fb8-9a44-4400-90c9-7e8999165fb3","Type":"ContainerDied","Data":"0e43d42f14014c4a31070a1b89e194077686225407f1421dd8d1f22173ce2b4f"} Feb 24 03:50:24 crc kubenswrapper[4923]: I0224 03:50:24.243969 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:24 crc kubenswrapper[4923]: I0224 03:50:24.244040 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:24 crc kubenswrapper[4923]: I0224 03:50:24.311256 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:25 crc kubenswrapper[4923]: I0224 03:50:25.249968 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:25 crc kubenswrapper[4923]: I0224 03:50:25.500615 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l52zx"] Feb 24 03:50:26 crc kubenswrapper[4923]: I0224 03:50:26.176809 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qvld" event={"ID":"24724fb8-9a44-4400-90c9-7e8999165fb3","Type":"ContainerStarted","Data":"a65fd811bde2209053f67e66ef136b0f10e2ea4ffc9f73afb32488a5a9a421c8"} Feb 24 03:50:26 crc kubenswrapper[4923]: I0224 03:50:26.196119 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6qvld" podStartSLOduration=3.792006191 podStartE2EDuration="6.196092358s" podCreationTimestamp="2026-02-24 03:50:20 +0000 UTC" firstStartedPulling="2026-02-24 03:50:21.881726505 +0000 UTC m=+3345.898797318" lastFinishedPulling="2026-02-24 03:50:24.285812652 +0000 UTC m=+3348.302883485" 
observedRunningTime="2026-02-24 03:50:26.191873198 +0000 UTC m=+3350.208944011" watchObservedRunningTime="2026-02-24 03:50:26.196092358 +0000 UTC m=+3350.213163211" Feb 24 03:50:26 crc kubenswrapper[4923]: I0224 03:50:26.713227 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:50:26 crc kubenswrapper[4923]: E0224 03:50:26.713664 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.184751 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l52zx" podUID="863c713d-6d19-468a-b762-e7b847400478" containerName="registry-server" containerID="cri-o://837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661" gracePeriod=2 Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.690232 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.880110 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqpkv\" (UniqueName: \"kubernetes.io/projected/863c713d-6d19-468a-b762-e7b847400478-kube-api-access-wqpkv\") pod \"863c713d-6d19-468a-b762-e7b847400478\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.880201 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-catalog-content\") pod \"863c713d-6d19-468a-b762-e7b847400478\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.880377 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-utilities\") pod \"863c713d-6d19-468a-b762-e7b847400478\" (UID: \"863c713d-6d19-468a-b762-e7b847400478\") " Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.887638 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-utilities" (OuterVolumeSpecName: "utilities") pod "863c713d-6d19-468a-b762-e7b847400478" (UID: "863c713d-6d19-468a-b762-e7b847400478"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.889421 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/863c713d-6d19-468a-b762-e7b847400478-kube-api-access-wqpkv" (OuterVolumeSpecName: "kube-api-access-wqpkv") pod "863c713d-6d19-468a-b762-e7b847400478" (UID: "863c713d-6d19-468a-b762-e7b847400478"). InnerVolumeSpecName "kube-api-access-wqpkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.908561 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "863c713d-6d19-468a-b762-e7b847400478" (UID: "863c713d-6d19-468a-b762-e7b847400478"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.983367 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.983400 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqpkv\" (UniqueName: \"kubernetes.io/projected/863c713d-6d19-468a-b762-e7b847400478-kube-api-access-wqpkv\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:27 crc kubenswrapper[4923]: I0224 03:50:27.983412 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/863c713d-6d19-468a-b762-e7b847400478-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.199434 4923 generic.go:334] "Generic (PLEG): container finished" podID="863c713d-6d19-468a-b762-e7b847400478" containerID="837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661" exitCode=0 Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.199490 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l52zx" event={"ID":"863c713d-6d19-468a-b762-e7b847400478","Type":"ContainerDied","Data":"837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661"} Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.199563 4923 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-l52zx" event={"ID":"863c713d-6d19-468a-b762-e7b847400478","Type":"ContainerDied","Data":"cc794c3c0c78b6cb29a727fe0dc6f9fe1812f49122c943233578af194a69e1dc"} Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.199557 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l52zx" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.199581 4923 scope.go:117] "RemoveContainer" containerID="837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.231428 4923 scope.go:117] "RemoveContainer" containerID="9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.250486 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l52zx"] Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.260570 4923 scope.go:117] "RemoveContainer" containerID="ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.264704 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l52zx"] Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.319468 4923 scope.go:117] "RemoveContainer" containerID="837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661" Feb 24 03:50:28 crc kubenswrapper[4923]: E0224 03:50:28.323410 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661\": container with ID starting with 837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661 not found: ID does not exist" containerID="837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.323447 4923 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661"} err="failed to get container status \"837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661\": rpc error: code = NotFound desc = could not find container \"837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661\": container with ID starting with 837f90576fc6c1be0b1f24541d329ac6d1a3af4ff2073064a33ecc4863c9d661 not found: ID does not exist" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.323472 4923 scope.go:117] "RemoveContainer" containerID="9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380" Feb 24 03:50:28 crc kubenswrapper[4923]: E0224 03:50:28.323713 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380\": container with ID starting with 9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380 not found: ID does not exist" containerID="9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.323733 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380"} err="failed to get container status \"9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380\": rpc error: code = NotFound desc = could not find container \"9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380\": container with ID starting with 9c04e64b721cf86176103ac3d6f3abfccb8cf937ee4ccc783f3784f1ab6f1380 not found: ID does not exist" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.323747 4923 scope.go:117] "RemoveContainer" containerID="ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485" Feb 24 03:50:28 crc kubenswrapper[4923]: E0224 
03:50:28.324048 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485\": container with ID starting with ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485 not found: ID does not exist" containerID="ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485" Feb 24 03:50:28 crc kubenswrapper[4923]: I0224 03:50:28.324067 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485"} err="failed to get container status \"ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485\": rpc error: code = NotFound desc = could not find container \"ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485\": container with ID starting with ec4a41e2d2483a62fb9fbb92f8da133e9d27b007b436f3fb5aa192e18c4c9485 not found: ID does not exist" Feb 24 03:50:29 crc kubenswrapper[4923]: I0224 03:50:29.726262 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="863c713d-6d19-468a-b762-e7b847400478" path="/var/lib/kubelet/pods/863c713d-6d19-468a-b762-e7b847400478/volumes" Feb 24 03:50:30 crc kubenswrapper[4923]: I0224 03:50:30.646077 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:30 crc kubenswrapper[4923]: I0224 03:50:30.646427 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:30 crc kubenswrapper[4923]: I0224 03:50:30.711727 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:31 crc kubenswrapper[4923]: I0224 03:50:31.283274 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:31 crc kubenswrapper[4923]: I0224 03:50:31.899836 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qvld"] Feb 24 03:50:33 crc kubenswrapper[4923]: I0224 03:50:33.244199 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6qvld" podUID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerName="registry-server" containerID="cri-o://a65fd811bde2209053f67e66ef136b0f10e2ea4ffc9f73afb32488a5a9a421c8" gracePeriod=2 Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.259949 4923 generic.go:334] "Generic (PLEG): container finished" podID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerID="a65fd811bde2209053f67e66ef136b0f10e2ea4ffc9f73afb32488a5a9a421c8" exitCode=0 Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.260032 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qvld" event={"ID":"24724fb8-9a44-4400-90c9-7e8999165fb3","Type":"ContainerDied","Data":"a65fd811bde2209053f67e66ef136b0f10e2ea4ffc9f73afb32488a5a9a421c8"} Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.260282 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qvld" event={"ID":"24724fb8-9a44-4400-90c9-7e8999165fb3","Type":"ContainerDied","Data":"e18730afe4892ecbdb59a11379aa38cb87b693eae1bbe8397a699c9d61c645a9"} Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.260302 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e18730afe4892ecbdb59a11379aa38cb87b693eae1bbe8397a699c9d61c645a9" Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.309204 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.409794 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwt9c\" (UniqueName: \"kubernetes.io/projected/24724fb8-9a44-4400-90c9-7e8999165fb3-kube-api-access-nwt9c\") pod \"24724fb8-9a44-4400-90c9-7e8999165fb3\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.409873 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-utilities\") pod \"24724fb8-9a44-4400-90c9-7e8999165fb3\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.410042 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-catalog-content\") pod \"24724fb8-9a44-4400-90c9-7e8999165fb3\" (UID: \"24724fb8-9a44-4400-90c9-7e8999165fb3\") " Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.410839 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-utilities" (OuterVolumeSpecName: "utilities") pod "24724fb8-9a44-4400-90c9-7e8999165fb3" (UID: "24724fb8-9a44-4400-90c9-7e8999165fb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.415632 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24724fb8-9a44-4400-90c9-7e8999165fb3-kube-api-access-nwt9c" (OuterVolumeSpecName: "kube-api-access-nwt9c") pod "24724fb8-9a44-4400-90c9-7e8999165fb3" (UID: "24724fb8-9a44-4400-90c9-7e8999165fb3"). InnerVolumeSpecName "kube-api-access-nwt9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.470524 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24724fb8-9a44-4400-90c9-7e8999165fb3" (UID: "24724fb8-9a44-4400-90c9-7e8999165fb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.512506 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.512539 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwt9c\" (UniqueName: \"kubernetes.io/projected/24724fb8-9a44-4400-90c9-7e8999165fb3-kube-api-access-nwt9c\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:34 crc kubenswrapper[4923]: I0224 03:50:34.512551 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24724fb8-9a44-4400-90c9-7e8999165fb3-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:35 crc kubenswrapper[4923]: I0224 03:50:35.267502 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qvld" Feb 24 03:50:35 crc kubenswrapper[4923]: I0224 03:50:35.304757 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qvld"] Feb 24 03:50:35 crc kubenswrapper[4923]: I0224 03:50:35.315952 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6qvld"] Feb 24 03:50:35 crc kubenswrapper[4923]: I0224 03:50:35.733371 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24724fb8-9a44-4400-90c9-7e8999165fb3" path="/var/lib/kubelet/pods/24724fb8-9a44-4400-90c9-7e8999165fb3/volumes" Feb 24 03:50:38 crc kubenswrapper[4923]: I0224 03:50:38.713406 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:50:38 crc kubenswrapper[4923]: E0224 03:50:38.714285 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:50:52 crc kubenswrapper[4923]: I0224 03:50:52.712882 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:50:52 crc kubenswrapper[4923]: E0224 03:50:52.713939 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" 
podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:50:57 crc kubenswrapper[4923]: I0224 03:50:57.465566 4923 generic.go:334] "Generic (PLEG): container finished" podID="9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" containerID="bf0a98d09807405cb7c48564281efb8750b3c036c955fab8ced2931f9c0d695b" exitCode=0 Feb 24 03:50:57 crc kubenswrapper[4923]: I0224 03:50:57.465807 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761","Type":"ContainerDied","Data":"bf0a98d09807405cb7c48564281efb8750b3c036c955fab8ced2931f9c0d695b"} Feb 24 03:50:58 crc kubenswrapper[4923]: I0224 03:50:58.976503 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.125719 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdpjk\" (UniqueName: \"kubernetes.io/projected/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-kube-api-access-wdpjk\") pod \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.125828 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-temporary\") pod \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.125847 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.125875 4923 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ca-certs\") pod \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.125927 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config-secret\") pod \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.125994 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config\") pod \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.126048 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-config-data\") pod \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.126096 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ssh-key\") pod \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\" (UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.126120 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-workdir\") pod \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\" 
(UID: \"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761\") " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.126669 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" (UID: "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.126858 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-config-data" (OuterVolumeSpecName: "config-data") pod "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" (UID: "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.130270 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" (UID: "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.131749 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" (UID: "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.132199 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-kube-api-access-wdpjk" (OuterVolumeSpecName: "kube-api-access-wdpjk") pod "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" (UID: "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761"). InnerVolumeSpecName "kube-api-access-wdpjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.153146 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" (UID: "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.158703 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" (UID: "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.161605 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" (UID: "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.183220 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" (UID: "9b6f2b0b-f8d2-4a36-a1a2-177dcf809761"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.227768 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.228000 4923 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.228096 4923 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.228156 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdpjk\" (UniqueName: \"kubernetes.io/projected/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-kube-api-access-wdpjk\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.228241 4923 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.228342 4923 reconciler_common.go:293] "Volume detached for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.228416 4923 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.228499 4923 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.228581 4923 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b6f2b0b-f8d2-4a36-a1a2-177dcf809761-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.245645 4923 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.330342 4923 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.494429 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9b6f2b0b-f8d2-4a36-a1a2-177dcf809761","Type":"ContainerDied","Data":"06bafcb4f1ea20da89fefd4d5340464e40dc777a10f21a9bdfd370d99f3a5f3d"} Feb 24 03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.494490 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06bafcb4f1ea20da89fefd4d5340464e40dc777a10f21a9bdfd370d99f3a5f3d" Feb 24 
03:50:59 crc kubenswrapper[4923]: I0224 03:50:59.494499 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.740676 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9dwkp"] Feb 24 03:51:01 crc kubenswrapper[4923]: E0224 03:51:01.742827 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863c713d-6d19-468a-b762-e7b847400478" containerName="registry-server" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.742934 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="863c713d-6d19-468a-b762-e7b847400478" containerName="registry-server" Feb 24 03:51:01 crc kubenswrapper[4923]: E0224 03:51:01.743029 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerName="extract-content" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.743112 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerName="extract-content" Feb 24 03:51:01 crc kubenswrapper[4923]: E0224 03:51:01.743203 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863c713d-6d19-468a-b762-e7b847400478" containerName="extract-utilities" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.743281 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="863c713d-6d19-468a-b762-e7b847400478" containerName="extract-utilities" Feb 24 03:51:01 crc kubenswrapper[4923]: E0224 03:51:01.743395 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerName="registry-server" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.743474 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerName="registry-server" Feb 24 03:51:01 crc kubenswrapper[4923]: E0224 
03:51:01.743571 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" containerName="tempest-tests-tempest-tests-runner" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.743650 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" containerName="tempest-tests-tempest-tests-runner" Feb 24 03:51:01 crc kubenswrapper[4923]: E0224 03:51:01.743831 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerName="extract-utilities" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.743912 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerName="extract-utilities" Feb 24 03:51:01 crc kubenswrapper[4923]: E0224 03:51:01.743992 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="863c713d-6d19-468a-b762-e7b847400478" containerName="extract-content" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.744063 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="863c713d-6d19-468a-b762-e7b847400478" containerName="extract-content" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.744384 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="863c713d-6d19-468a-b762-e7b847400478" containerName="registry-server" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.744514 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6f2b0b-f8d2-4a36-a1a2-177dcf809761" containerName="tempest-tests-tempest-tests-runner" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.744610 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="24724fb8-9a44-4400-90c9-7e8999165fb3" containerName="registry-server" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.746402 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dwkp"] Feb 24 
03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.746645 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.881758 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-catalog-content\") pod \"community-operators-9dwkp\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.882408 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-utilities\") pod \"community-operators-9dwkp\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.882655 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25dfd\" (UniqueName: \"kubernetes.io/projected/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-kube-api-access-25dfd\") pod \"community-operators-9dwkp\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.984094 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-catalog-content\") pod \"community-operators-9dwkp\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.984188 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-utilities\") pod \"community-operators-9dwkp\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.984315 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25dfd\" (UniqueName: \"kubernetes.io/projected/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-kube-api-access-25dfd\") pod \"community-operators-9dwkp\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.984964 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-utilities\") pod \"community-operators-9dwkp\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:01 crc kubenswrapper[4923]: I0224 03:51:01.985273 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-catalog-content\") pod \"community-operators-9dwkp\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:02 crc kubenswrapper[4923]: I0224 03:51:02.004394 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25dfd\" (UniqueName: \"kubernetes.io/projected/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-kube-api-access-25dfd\") pod \"community-operators-9dwkp\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:02 crc kubenswrapper[4923]: I0224 03:51:02.079568 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:02 crc kubenswrapper[4923]: I0224 03:51:02.634320 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dwkp"] Feb 24 03:51:03 crc kubenswrapper[4923]: I0224 03:51:03.541357 4923 generic.go:334] "Generic (PLEG): container finished" podID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerID="a556c89def9637f1167713ec3e304fe89dd70fac2a6bdb7d93cbc78ec804844b" exitCode=0 Feb 24 03:51:03 crc kubenswrapper[4923]: I0224 03:51:03.541486 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dwkp" event={"ID":"4e2a09cc-09f0-4b9d-8c25-5de1abec9607","Type":"ContainerDied","Data":"a556c89def9637f1167713ec3e304fe89dd70fac2a6bdb7d93cbc78ec804844b"} Feb 24 03:51:03 crc kubenswrapper[4923]: I0224 03:51:03.541750 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dwkp" event={"ID":"4e2a09cc-09f0-4b9d-8c25-5de1abec9607","Type":"ContainerStarted","Data":"39c9b9e351b639c9bdd86f9db005e76818a2bcc43c8a2113ffda09e0af256de8"} Feb 24 03:51:03 crc kubenswrapper[4923]: I0224 03:51:03.713219 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:51:03 crc kubenswrapper[4923]: E0224 03:51:03.714125 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:51:04 crc kubenswrapper[4923]: I0224 03:51:04.549782 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dwkp" 
event={"ID":"4e2a09cc-09f0-4b9d-8c25-5de1abec9607","Type":"ContainerStarted","Data":"e88786dcadd62521a93e23f2eed0f352c140b9dde3909c720cd0e8f17be47f2a"} Feb 24 03:51:05 crc kubenswrapper[4923]: I0224 03:51:05.563535 4923 generic.go:334] "Generic (PLEG): container finished" podID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerID="e88786dcadd62521a93e23f2eed0f352c140b9dde3909c720cd0e8f17be47f2a" exitCode=0 Feb 24 03:51:05 crc kubenswrapper[4923]: I0224 03:51:05.563596 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dwkp" event={"ID":"4e2a09cc-09f0-4b9d-8c25-5de1abec9607","Type":"ContainerDied","Data":"e88786dcadd62521a93e23f2eed0f352c140b9dde3909c720cd0e8f17be47f2a"} Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.387931 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.389518 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.391645 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nvb7w" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.401746 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.466811 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkhlq\" (UniqueName: \"kubernetes.io/projected/ff180506-c96d-4b80-8568-e972c702ff06-kube-api-access-wkhlq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ff180506-c96d-4b80-8568-e972c702ff06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.466919 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ff180506-c96d-4b80-8568-e972c702ff06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.569170 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkhlq\" (UniqueName: \"kubernetes.io/projected/ff180506-c96d-4b80-8568-e972c702ff06-kube-api-access-wkhlq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ff180506-c96d-4b80-8568-e972c702ff06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.569245 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ff180506-c96d-4b80-8568-e972c702ff06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.569778 4923 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ff180506-c96d-4b80-8568-e972c702ff06\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.576063 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dwkp" event={"ID":"4e2a09cc-09f0-4b9d-8c25-5de1abec9607","Type":"ContainerStarted","Data":"c96374c0efb10e284b6571cf620e61f89b9f33573841d6ecb5d6e5f7b29c9d6a"} Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.598069 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkhlq\" (UniqueName: \"kubernetes.io/projected/ff180506-c96d-4b80-8568-e972c702ff06-kube-api-access-wkhlq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ff180506-c96d-4b80-8568-e972c702ff06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.609727 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9dwkp" podStartSLOduration=3.175986159 podStartE2EDuration="5.609708379s" podCreationTimestamp="2026-02-24 03:51:01 +0000 UTC" firstStartedPulling="2026-02-24 03:51:03.543221316 +0000 UTC m=+3387.560292139" lastFinishedPulling="2026-02-24 03:51:05.976943546 +0000 UTC m=+3389.994014359" observedRunningTime="2026-02-24 03:51:06.604779981 +0000 UTC 
m=+3390.621850794" watchObservedRunningTime="2026-02-24 03:51:06.609708379 +0000 UTC m=+3390.626779192" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.610489 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ff180506-c96d-4b80-8568-e972c702ff06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 24 03:51:06 crc kubenswrapper[4923]: I0224 03:51:06.713819 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 24 03:51:07 crc kubenswrapper[4923]: I0224 03:51:07.194064 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 24 03:51:07 crc kubenswrapper[4923]: I0224 03:51:07.588947 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ff180506-c96d-4b80-8568-e972c702ff06","Type":"ContainerStarted","Data":"37094c6e1dfd7f69565f8f29d6394ba09fa46f8df06a89ff1135a3f2d87227a0"} Feb 24 03:51:08 crc kubenswrapper[4923]: I0224 03:51:08.604177 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ff180506-c96d-4b80-8568-e972c702ff06","Type":"ContainerStarted","Data":"895efecd6561da4af75e8ca55e4999ee5029901841e634d0c21b439b7bf63816"} Feb 24 03:51:08 crc kubenswrapper[4923]: I0224 03:51:08.626318 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.875918295 podStartE2EDuration="2.626286255s" podCreationTimestamp="2026-02-24 03:51:06 +0000 UTC" firstStartedPulling="2026-02-24 03:51:07.191435903 +0000 UTC m=+3391.208506726" 
lastFinishedPulling="2026-02-24 03:51:07.941803873 +0000 UTC m=+3391.958874686" observedRunningTime="2026-02-24 03:51:08.619567 +0000 UTC m=+3392.636637813" watchObservedRunningTime="2026-02-24 03:51:08.626286255 +0000 UTC m=+3392.643357068" Feb 24 03:51:12 crc kubenswrapper[4923]: I0224 03:51:12.079993 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:12 crc kubenswrapper[4923]: I0224 03:51:12.081454 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:12 crc kubenswrapper[4923]: I0224 03:51:12.151544 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:12 crc kubenswrapper[4923]: I0224 03:51:12.700417 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:12 crc kubenswrapper[4923]: I0224 03:51:12.760470 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dwkp"] Feb 24 03:51:14 crc kubenswrapper[4923]: I0224 03:51:14.665204 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9dwkp" podUID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerName="registry-server" containerID="cri-o://c96374c0efb10e284b6571cf620e61f89b9f33573841d6ecb5d6e5f7b29c9d6a" gracePeriod=2 Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.678631 4923 generic.go:334] "Generic (PLEG): container finished" podID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerID="c96374c0efb10e284b6571cf620e61f89b9f33573841d6ecb5d6e5f7b29c9d6a" exitCode=0 Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.678744 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dwkp" 
event={"ID":"4e2a09cc-09f0-4b9d-8c25-5de1abec9607","Type":"ContainerDied","Data":"c96374c0efb10e284b6571cf620e61f89b9f33573841d6ecb5d6e5f7b29c9d6a"} Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.678917 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dwkp" event={"ID":"4e2a09cc-09f0-4b9d-8c25-5de1abec9607","Type":"ContainerDied","Data":"39c9b9e351b639c9bdd86f9db005e76818a2bcc43c8a2113ffda09e0af256de8"} Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.678933 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39c9b9e351b639c9bdd86f9db005e76818a2bcc43c8a2113ffda09e0af256de8" Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.714723 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.744820 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-utilities\") pod \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.744866 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-catalog-content\") pod \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.744985 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25dfd\" (UniqueName: \"kubernetes.io/projected/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-kube-api-access-25dfd\") pod \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\" (UID: \"4e2a09cc-09f0-4b9d-8c25-5de1abec9607\") " Feb 24 03:51:15 crc kubenswrapper[4923]: 
I0224 03:51:15.745846 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-utilities" (OuterVolumeSpecName: "utilities") pod "4e2a09cc-09f0-4b9d-8c25-5de1abec9607" (UID: "4e2a09cc-09f0-4b9d-8c25-5de1abec9607"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.752562 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-kube-api-access-25dfd" (OuterVolumeSpecName: "kube-api-access-25dfd") pod "4e2a09cc-09f0-4b9d-8c25-5de1abec9607" (UID: "4e2a09cc-09f0-4b9d-8c25-5de1abec9607"). InnerVolumeSpecName "kube-api-access-25dfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.800971 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e2a09cc-09f0-4b9d-8c25-5de1abec9607" (UID: "4e2a09cc-09f0-4b9d-8c25-5de1abec9607"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.847050 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.847085 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 03:51:15 crc kubenswrapper[4923]: I0224 03:51:15.847102 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25dfd\" (UniqueName: \"kubernetes.io/projected/4e2a09cc-09f0-4b9d-8c25-5de1abec9607-kube-api-access-25dfd\") on node \"crc\" DevicePath \"\"" Feb 24 03:51:16 crc kubenswrapper[4923]: I0224 03:51:16.689248 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dwkp" Feb 24 03:51:16 crc kubenswrapper[4923]: I0224 03:51:16.779230 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dwkp"] Feb 24 03:51:16 crc kubenswrapper[4923]: I0224 03:51:16.793776 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9dwkp"] Feb 24 03:51:17 crc kubenswrapper[4923]: I0224 03:51:17.721207 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:51:17 crc kubenswrapper[4923]: E0224 03:51:17.721669 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:51:17 crc kubenswrapper[4923]: I0224 03:51:17.724082 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" path="/var/lib/kubelet/pods/4e2a09cc-09f0-4b9d-8c25-5de1abec9607/volumes" Feb 24 03:51:28 crc kubenswrapper[4923]: I0224 03:51:28.713397 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:51:28 crc kubenswrapper[4923]: E0224 03:51:28.714275 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.545538 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zvvp7/must-gather-blt9n"] Feb 24 03:51:29 crc kubenswrapper[4923]: E0224 03:51:29.546201 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerName="registry-server" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.546215 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerName="registry-server" Feb 24 03:51:29 crc kubenswrapper[4923]: E0224 03:51:29.546246 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerName="extract-utilities" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.546254 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" 
containerName="extract-utilities" Feb 24 03:51:29 crc kubenswrapper[4923]: E0224 03:51:29.546264 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerName="extract-content" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.546270 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerName="extract-content" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.546501 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2a09cc-09f0-4b9d-8c25-5de1abec9607" containerName="registry-server" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.547390 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.551666 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zvvp7"/"openshift-service-ca.crt" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.551776 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zvvp7"/"kube-root-ca.crt" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.581842 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zvvp7/must-gather-blt9n"] Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.703523 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qn6\" (UniqueName: \"kubernetes.io/projected/fa9f5cc7-6960-477e-8d16-beaa00b35881-kube-api-access-48qn6\") pod \"must-gather-blt9n\" (UID: \"fa9f5cc7-6960-477e-8d16-beaa00b35881\") " pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.703585 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/fa9f5cc7-6960-477e-8d16-beaa00b35881-must-gather-output\") pod \"must-gather-blt9n\" (UID: \"fa9f5cc7-6960-477e-8d16-beaa00b35881\") " pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.805650 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48qn6\" (UniqueName: \"kubernetes.io/projected/fa9f5cc7-6960-477e-8d16-beaa00b35881-kube-api-access-48qn6\") pod \"must-gather-blt9n\" (UID: \"fa9f5cc7-6960-477e-8d16-beaa00b35881\") " pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.805732 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fa9f5cc7-6960-477e-8d16-beaa00b35881-must-gather-output\") pod \"must-gather-blt9n\" (UID: \"fa9f5cc7-6960-477e-8d16-beaa00b35881\") " pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.806195 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fa9f5cc7-6960-477e-8d16-beaa00b35881-must-gather-output\") pod \"must-gather-blt9n\" (UID: \"fa9f5cc7-6960-477e-8d16-beaa00b35881\") " pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.845191 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qn6\" (UniqueName: \"kubernetes.io/projected/fa9f5cc7-6960-477e-8d16-beaa00b35881-kube-api-access-48qn6\") pod \"must-gather-blt9n\" (UID: \"fa9f5cc7-6960-477e-8d16-beaa00b35881\") " pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:51:29 crc kubenswrapper[4923]: I0224 03:51:29.873771 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:51:30 crc kubenswrapper[4923]: I0224 03:51:30.329285 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zvvp7/must-gather-blt9n"] Feb 24 03:51:30 crc kubenswrapper[4923]: W0224 03:51:30.339079 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa9f5cc7_6960_477e_8d16_beaa00b35881.slice/crio-324973aac53f48a80500a419fc481b0582e82b701dbd8ce4d345c0f657174bed WatchSource:0}: Error finding container 324973aac53f48a80500a419fc481b0582e82b701dbd8ce4d345c0f657174bed: Status 404 returned error can't find the container with id 324973aac53f48a80500a419fc481b0582e82b701dbd8ce4d345c0f657174bed Feb 24 03:51:30 crc kubenswrapper[4923]: I0224 03:51:30.819373 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/must-gather-blt9n" event={"ID":"fa9f5cc7-6960-477e-8d16-beaa00b35881","Type":"ContainerStarted","Data":"324973aac53f48a80500a419fc481b0582e82b701dbd8ce4d345c0f657174bed"} Feb 24 03:51:36 crc kubenswrapper[4923]: I0224 03:51:36.879263 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/must-gather-blt9n" event={"ID":"fa9f5cc7-6960-477e-8d16-beaa00b35881","Type":"ContainerStarted","Data":"856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e"} Feb 24 03:51:36 crc kubenswrapper[4923]: I0224 03:51:36.879790 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/must-gather-blt9n" event={"ID":"fa9f5cc7-6960-477e-8d16-beaa00b35881","Type":"ContainerStarted","Data":"91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273"} Feb 24 03:51:36 crc kubenswrapper[4923]: I0224 03:51:36.902350 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zvvp7/must-gather-blt9n" podStartSLOduration=1.8595451170000001 
podStartE2EDuration="7.902316873s" podCreationTimestamp="2026-02-24 03:51:29 +0000 UTC" firstStartedPulling="2026-02-24 03:51:30.34173981 +0000 UTC m=+3414.358810633" lastFinishedPulling="2026-02-24 03:51:36.384511576 +0000 UTC m=+3420.401582389" observedRunningTime="2026-02-24 03:51:36.893660298 +0000 UTC m=+3420.910731111" watchObservedRunningTime="2026-02-24 03:51:36.902316873 +0000 UTC m=+3420.919387686" Feb 24 03:51:39 crc kubenswrapper[4923]: I0224 03:51:39.712971 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:51:39 crc kubenswrapper[4923]: E0224 03:51:39.713734 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.131604 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zvvp7/crc-debug-bzjzl"] Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.133621 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.135989 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zvvp7"/"default-dockercfg-clzv2" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.326742 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44d75f8e-9596-4812-bfcd-555733a90f5f-host\") pod \"crc-debug-bzjzl\" (UID: \"44d75f8e-9596-4812-bfcd-555733a90f5f\") " pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.326810 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9h8\" (UniqueName: \"kubernetes.io/projected/44d75f8e-9596-4812-bfcd-555733a90f5f-kube-api-access-rz9h8\") pod \"crc-debug-bzjzl\" (UID: \"44d75f8e-9596-4812-bfcd-555733a90f5f\") " pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.430217 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz9h8\" (UniqueName: \"kubernetes.io/projected/44d75f8e-9596-4812-bfcd-555733a90f5f-kube-api-access-rz9h8\") pod \"crc-debug-bzjzl\" (UID: \"44d75f8e-9596-4812-bfcd-555733a90f5f\") " pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.430532 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44d75f8e-9596-4812-bfcd-555733a90f5f-host\") pod \"crc-debug-bzjzl\" (UID: \"44d75f8e-9596-4812-bfcd-555733a90f5f\") " pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.430609 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/44d75f8e-9596-4812-bfcd-555733a90f5f-host\") pod \"crc-debug-bzjzl\" (UID: \"44d75f8e-9596-4812-bfcd-555733a90f5f\") " pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.451437 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz9h8\" (UniqueName: \"kubernetes.io/projected/44d75f8e-9596-4812-bfcd-555733a90f5f-kube-api-access-rz9h8\") pod \"crc-debug-bzjzl\" (UID: \"44d75f8e-9596-4812-bfcd-555733a90f5f\") " pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.457626 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:51:40 crc kubenswrapper[4923]: I0224 03:51:40.918020 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" event={"ID":"44d75f8e-9596-4812-bfcd-555733a90f5f","Type":"ContainerStarted","Data":"4952572d10445c0a96845c05d39041d6ae360de17322d7368cc0d522f698896f"} Feb 24 03:51:51 crc kubenswrapper[4923]: I0224 03:51:51.021132 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" event={"ID":"44d75f8e-9596-4812-bfcd-555733a90f5f","Type":"ContainerStarted","Data":"78f4a8c69e7150eeba2a4f6846788bfe79e7ec5c8f4b66ac838547a014b40e0b"} Feb 24 03:51:51 crc kubenswrapper[4923]: I0224 03:51:51.041393 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" podStartSLOduration=0.85852433 podStartE2EDuration="11.041374114s" podCreationTimestamp="2026-02-24 03:51:40 +0000 UTC" firstStartedPulling="2026-02-24 03:51:40.507110009 +0000 UTC m=+3424.524180822" lastFinishedPulling="2026-02-24 03:51:50.689959793 +0000 UTC m=+3434.707030606" observedRunningTime="2026-02-24 03:51:51.038423517 +0000 UTC m=+3435.055494330" watchObservedRunningTime="2026-02-24 
03:51:51.041374114 +0000 UTC m=+3435.058444927" Feb 24 03:51:52 crc kubenswrapper[4923]: I0224 03:51:52.713631 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:51:52 crc kubenswrapper[4923]: E0224 03:51:52.714613 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:52:05 crc kubenswrapper[4923]: I0224 03:52:05.712934 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:52:05 crc kubenswrapper[4923]: E0224 03:52:05.713737 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:52:19 crc kubenswrapper[4923]: I0224 03:52:19.713440 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:52:19 crc kubenswrapper[4923]: E0224 03:52:19.714109 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:52:27 crc kubenswrapper[4923]: I0224 03:52:27.324041 4923 generic.go:334] "Generic (PLEG): container finished" podID="44d75f8e-9596-4812-bfcd-555733a90f5f" containerID="78f4a8c69e7150eeba2a4f6846788bfe79e7ec5c8f4b66ac838547a014b40e0b" exitCode=0 Feb 24 03:52:27 crc kubenswrapper[4923]: I0224 03:52:27.324129 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" event={"ID":"44d75f8e-9596-4812-bfcd-555733a90f5f","Type":"ContainerDied","Data":"78f4a8c69e7150eeba2a4f6846788bfe79e7ec5c8f4b66ac838547a014b40e0b"} Feb 24 03:52:28 crc kubenswrapper[4923]: I0224 03:52:28.449627 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:52:28 crc kubenswrapper[4923]: I0224 03:52:28.454088 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44d75f8e-9596-4812-bfcd-555733a90f5f-host\") pod \"44d75f8e-9596-4812-bfcd-555733a90f5f\" (UID: \"44d75f8e-9596-4812-bfcd-555733a90f5f\") " Feb 24 03:52:28 crc kubenswrapper[4923]: I0224 03:52:28.454470 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d75f8e-9596-4812-bfcd-555733a90f5f-host" (OuterVolumeSpecName: "host") pod "44d75f8e-9596-4812-bfcd-555733a90f5f" (UID: "44d75f8e-9596-4812-bfcd-555733a90f5f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:52:28 crc kubenswrapper[4923]: I0224 03:52:28.485173 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zvvp7/crc-debug-bzjzl"] Feb 24 03:52:28 crc kubenswrapper[4923]: I0224 03:52:28.494268 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zvvp7/crc-debug-bzjzl"] Feb 24 03:52:28 crc kubenswrapper[4923]: I0224 03:52:28.555257 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz9h8\" (UniqueName: \"kubernetes.io/projected/44d75f8e-9596-4812-bfcd-555733a90f5f-kube-api-access-rz9h8\") pod \"44d75f8e-9596-4812-bfcd-555733a90f5f\" (UID: \"44d75f8e-9596-4812-bfcd-555733a90f5f\") " Feb 24 03:52:28 crc kubenswrapper[4923]: I0224 03:52:28.555886 4923 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44d75f8e-9596-4812-bfcd-555733a90f5f-host\") on node \"crc\" DevicePath \"\"" Feb 24 03:52:28 crc kubenswrapper[4923]: I0224 03:52:28.562868 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d75f8e-9596-4812-bfcd-555733a90f5f-kube-api-access-rz9h8" (OuterVolumeSpecName: "kube-api-access-rz9h8") pod "44d75f8e-9596-4812-bfcd-555733a90f5f" (UID: "44d75f8e-9596-4812-bfcd-555733a90f5f"). InnerVolumeSpecName "kube-api-access-rz9h8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:52:28 crc kubenswrapper[4923]: I0224 03:52:28.657745 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz9h8\" (UniqueName: \"kubernetes.io/projected/44d75f8e-9596-4812-bfcd-555733a90f5f-kube-api-access-rz9h8\") on node \"crc\" DevicePath \"\"" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.346182 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4952572d10445c0a96845c05d39041d6ae360de17322d7368cc0d522f698896f" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.346244 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-bzjzl" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.665513 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zvvp7/crc-debug-k2x6r"] Feb 24 03:52:29 crc kubenswrapper[4923]: E0224 03:52:29.665871 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d75f8e-9596-4812-bfcd-555733a90f5f" containerName="container-00" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.665883 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d75f8e-9596-4812-bfcd-555733a90f5f" containerName="container-00" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.666103 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d75f8e-9596-4812-bfcd-555733a90f5f" containerName="container-00" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.666823 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.668738 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zvvp7"/"default-dockercfg-clzv2" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.725932 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d75f8e-9596-4812-bfcd-555733a90f5f" path="/var/lib/kubelet/pods/44d75f8e-9596-4812-bfcd-555733a90f5f/volumes" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.779103 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/620f6339-6b4d-4812-816e-d87cae04753f-host\") pod \"crc-debug-k2x6r\" (UID: \"620f6339-6b4d-4812-816e-d87cae04753f\") " pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.779509 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rznpw\" (UniqueName: \"kubernetes.io/projected/620f6339-6b4d-4812-816e-d87cae04753f-kube-api-access-rznpw\") pod \"crc-debug-k2x6r\" (UID: \"620f6339-6b4d-4812-816e-d87cae04753f\") " pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.880881 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/620f6339-6b4d-4812-816e-d87cae04753f-host\") pod \"crc-debug-k2x6r\" (UID: \"620f6339-6b4d-4812-816e-d87cae04753f\") " pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.880985 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rznpw\" (UniqueName: \"kubernetes.io/projected/620f6339-6b4d-4812-816e-d87cae04753f-kube-api-access-rznpw\") pod \"crc-debug-k2x6r\" (UID: 
\"620f6339-6b4d-4812-816e-d87cae04753f\") " pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.881061 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/620f6339-6b4d-4812-816e-d87cae04753f-host\") pod \"crc-debug-k2x6r\" (UID: \"620f6339-6b4d-4812-816e-d87cae04753f\") " pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.897651 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rznpw\" (UniqueName: \"kubernetes.io/projected/620f6339-6b4d-4812-816e-d87cae04753f-kube-api-access-rznpw\") pod \"crc-debug-k2x6r\" (UID: \"620f6339-6b4d-4812-816e-d87cae04753f\") " pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:29 crc kubenswrapper[4923]: I0224 03:52:29.984812 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:30 crc kubenswrapper[4923]: I0224 03:52:30.356537 4923 generic.go:334] "Generic (PLEG): container finished" podID="620f6339-6b4d-4812-816e-d87cae04753f" containerID="ab522e65bbc5740a6f6402e5ed4d3a0383a1f5c171dd9641dfb86e770c80a3e8" exitCode=0 Feb 24 03:52:30 crc kubenswrapper[4923]: I0224 03:52:30.356689 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" event={"ID":"620f6339-6b4d-4812-816e-d87cae04753f","Type":"ContainerDied","Data":"ab522e65bbc5740a6f6402e5ed4d3a0383a1f5c171dd9641dfb86e770c80a3e8"} Feb 24 03:52:30 crc kubenswrapper[4923]: I0224 03:52:30.356935 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" event={"ID":"620f6339-6b4d-4812-816e-d87cae04753f","Type":"ContainerStarted","Data":"98c5e91c235c9d1916142cce91ea360b5b737c2f359dc8a5ec4e3703cbe99903"} Feb 24 03:52:30 crc kubenswrapper[4923]: I0224 03:52:30.722393 4923 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zvvp7/crc-debug-k2x6r"] Feb 24 03:52:30 crc kubenswrapper[4923]: I0224 03:52:30.733070 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zvvp7/crc-debug-k2x6r"] Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.455624 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.609758 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/620f6339-6b4d-4812-816e-d87cae04753f-host\") pod \"620f6339-6b4d-4812-816e-d87cae04753f\" (UID: \"620f6339-6b4d-4812-816e-d87cae04753f\") " Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.609819 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rznpw\" (UniqueName: \"kubernetes.io/projected/620f6339-6b4d-4812-816e-d87cae04753f-kube-api-access-rznpw\") pod \"620f6339-6b4d-4812-816e-d87cae04753f\" (UID: \"620f6339-6b4d-4812-816e-d87cae04753f\") " Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.610104 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/620f6339-6b4d-4812-816e-d87cae04753f-host" (OuterVolumeSpecName: "host") pod "620f6339-6b4d-4812-816e-d87cae04753f" (UID: "620f6339-6b4d-4812-816e-d87cae04753f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.610197 4923 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/620f6339-6b4d-4812-816e-d87cae04753f-host\") on node \"crc\" DevicePath \"\"" Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.627503 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/620f6339-6b4d-4812-816e-d87cae04753f-kube-api-access-rznpw" (OuterVolumeSpecName: "kube-api-access-rznpw") pod "620f6339-6b4d-4812-816e-d87cae04753f" (UID: "620f6339-6b4d-4812-816e-d87cae04753f"). InnerVolumeSpecName "kube-api-access-rznpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.711718 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rznpw\" (UniqueName: \"kubernetes.io/projected/620f6339-6b4d-4812-816e-d87cae04753f-kube-api-access-rznpw\") on node \"crc\" DevicePath \"\"" Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.712813 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:52:31 crc kubenswrapper[4923]: E0224 03:52:31.713048 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.724906 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="620f6339-6b4d-4812-816e-d87cae04753f" path="/var/lib/kubelet/pods/620f6339-6b4d-4812-816e-d87cae04753f/volumes" Feb 24 03:52:31 crc 
kubenswrapper[4923]: I0224 03:52:31.931035 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zvvp7/crc-debug-8ncsl"] Feb 24 03:52:31 crc kubenswrapper[4923]: E0224 03:52:31.931508 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="620f6339-6b4d-4812-816e-d87cae04753f" containerName="container-00" Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.931531 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="620f6339-6b4d-4812-816e-d87cae04753f" containerName="container-00" Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.931769 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="620f6339-6b4d-4812-816e-d87cae04753f" containerName="container-00" Feb 24 03:52:31 crc kubenswrapper[4923]: I0224 03:52:31.932516 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.018361 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec11254-9351-4e31-8176-8491c229c0c4-host\") pod \"crc-debug-8ncsl\" (UID: \"5ec11254-9351-4e31-8176-8491c229c0c4\") " pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.018535 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srh2j\" (UniqueName: \"kubernetes.io/projected/5ec11254-9351-4e31-8176-8491c229c0c4-kube-api-access-srh2j\") pod \"crc-debug-8ncsl\" (UID: \"5ec11254-9351-4e31-8176-8491c229c0c4\") " pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.120193 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec11254-9351-4e31-8176-8491c229c0c4-host\") pod \"crc-debug-8ncsl\" (UID: 
\"5ec11254-9351-4e31-8176-8491c229c0c4\") " pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.120333 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srh2j\" (UniqueName: \"kubernetes.io/projected/5ec11254-9351-4e31-8176-8491c229c0c4-kube-api-access-srh2j\") pod \"crc-debug-8ncsl\" (UID: \"5ec11254-9351-4e31-8176-8491c229c0c4\") " pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.120333 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec11254-9351-4e31-8176-8491c229c0c4-host\") pod \"crc-debug-8ncsl\" (UID: \"5ec11254-9351-4e31-8176-8491c229c0c4\") " pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.149643 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srh2j\" (UniqueName: \"kubernetes.io/projected/5ec11254-9351-4e31-8176-8491c229c0c4-kube-api-access-srh2j\") pod \"crc-debug-8ncsl\" (UID: \"5ec11254-9351-4e31-8176-8491c229c0c4\") " pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.257695 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:32 crc kubenswrapper[4923]: W0224 03:52:32.284200 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec11254_9351_4e31_8176_8491c229c0c4.slice/crio-1a1bf138514ad4abc843b3a8c319dc250da03d55f5cc8487464976545e8a32d2 WatchSource:0}: Error finding container 1a1bf138514ad4abc843b3a8c319dc250da03d55f5cc8487464976545e8a32d2: Status 404 returned error can't find the container with id 1a1bf138514ad4abc843b3a8c319dc250da03d55f5cc8487464976545e8a32d2 Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.386174 4923 scope.go:117] "RemoveContainer" containerID="ab522e65bbc5740a6f6402e5ed4d3a0383a1f5c171dd9641dfb86e770c80a3e8" Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.386220 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-k2x6r" Feb 24 03:52:32 crc kubenswrapper[4923]: I0224 03:52:32.389720 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" event={"ID":"5ec11254-9351-4e31-8176-8491c229c0c4","Type":"ContainerStarted","Data":"1a1bf138514ad4abc843b3a8c319dc250da03d55f5cc8487464976545e8a32d2"} Feb 24 03:52:33 crc kubenswrapper[4923]: I0224 03:52:33.409696 4923 generic.go:334] "Generic (PLEG): container finished" podID="5ec11254-9351-4e31-8176-8491c229c0c4" containerID="ed2792a306e269c228ff4328d963cbe3388911cd399e8f2172c01eaca0fe1ef7" exitCode=0 Feb 24 03:52:33 crc kubenswrapper[4923]: I0224 03:52:33.409775 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" event={"ID":"5ec11254-9351-4e31-8176-8491c229c0c4","Type":"ContainerDied","Data":"ed2792a306e269c228ff4328d963cbe3388911cd399e8f2172c01eaca0fe1ef7"} Feb 24 03:52:33 crc kubenswrapper[4923]: I0224 03:52:33.455271 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-zvvp7/crc-debug-8ncsl"] Feb 24 03:52:33 crc kubenswrapper[4923]: I0224 03:52:33.467503 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zvvp7/crc-debug-8ncsl"] Feb 24 03:52:34 crc kubenswrapper[4923]: I0224 03:52:34.539022 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:34 crc kubenswrapper[4923]: I0224 03:52:34.670927 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec11254-9351-4e31-8176-8491c229c0c4-host\") pod \"5ec11254-9351-4e31-8176-8491c229c0c4\" (UID: \"5ec11254-9351-4e31-8176-8491c229c0c4\") " Feb 24 03:52:34 crc kubenswrapper[4923]: I0224 03:52:34.671237 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srh2j\" (UniqueName: \"kubernetes.io/projected/5ec11254-9351-4e31-8176-8491c229c0c4-kube-api-access-srh2j\") pod \"5ec11254-9351-4e31-8176-8491c229c0c4\" (UID: \"5ec11254-9351-4e31-8176-8491c229c0c4\") " Feb 24 03:52:34 crc kubenswrapper[4923]: I0224 03:52:34.671078 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ec11254-9351-4e31-8176-8491c229c0c4-host" (OuterVolumeSpecName: "host") pod "5ec11254-9351-4e31-8176-8491c229c0c4" (UID: "5ec11254-9351-4e31-8176-8491c229c0c4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 03:52:34 crc kubenswrapper[4923]: I0224 03:52:34.671732 4923 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec11254-9351-4e31-8176-8491c229c0c4-host\") on node \"crc\" DevicePath \"\"" Feb 24 03:52:34 crc kubenswrapper[4923]: I0224 03:52:34.676702 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec11254-9351-4e31-8176-8491c229c0c4-kube-api-access-srh2j" (OuterVolumeSpecName: "kube-api-access-srh2j") pod "5ec11254-9351-4e31-8176-8491c229c0c4" (UID: "5ec11254-9351-4e31-8176-8491c229c0c4"). InnerVolumeSpecName "kube-api-access-srh2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:52:34 crc kubenswrapper[4923]: I0224 03:52:34.773407 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srh2j\" (UniqueName: \"kubernetes.io/projected/5ec11254-9351-4e31-8176-8491c229c0c4-kube-api-access-srh2j\") on node \"crc\" DevicePath \"\"" Feb 24 03:52:35 crc kubenswrapper[4923]: I0224 03:52:35.492279 4923 scope.go:117] "RemoveContainer" containerID="ed2792a306e269c228ff4328d963cbe3388911cd399e8f2172c01eaca0fe1ef7" Feb 24 03:52:35 crc kubenswrapper[4923]: I0224 03:52:35.492358 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvvp7/crc-debug-8ncsl" Feb 24 03:52:35 crc kubenswrapper[4923]: I0224 03:52:35.723227 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec11254-9351-4e31-8176-8491c229c0c4" path="/var/lib/kubelet/pods/5ec11254-9351-4e31-8176-8491c229c0c4/volumes" Feb 24 03:52:46 crc kubenswrapper[4923]: I0224 03:52:46.713879 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:52:46 crc kubenswrapper[4923]: E0224 03:52:46.715288 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.044199 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d7c54dbbb-xcg2j_6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed/barbican-api/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.162550 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d7c54dbbb-xcg2j_6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed/barbican-api-log/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.198620 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-895b8674b-v44h4_774cca46-21ee-41c1-81e7-00c89c26ad37/barbican-keystone-listener/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.318634 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-895b8674b-v44h4_774cca46-21ee-41c1-81e7-00c89c26ad37/barbican-keystone-listener-log/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: 
I0224 03:52:49.397736 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c8bfc6649-mz55h_4cafcd89-7a31-47f2-980b-9b9a6a21bd49/barbican-worker/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.406761 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c8bfc6649-mz55h_4cafcd89-7a31-47f2-980b-9b9a6a21bd49/barbican-worker-log/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.582463 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb_dd6fe20f-e2e1-46ae-aa88-cbfd410076a2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.619675 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0af0866b-f6b6-45cb-9322-25fc22f6b6b4/ceilometer-central-agent/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.760683 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0af0866b-f6b6-45cb-9322-25fc22f6b6b4/ceilometer-notification-agent/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.787289 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0af0866b-f6b6-45cb-9322-25fc22f6b6b4/sg-core/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.791629 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0af0866b-f6b6-45cb-9322-25fc22f6b6b4/proxy-httpd/0.log" Feb 24 03:52:49 crc kubenswrapper[4923]: I0224 03:52:49.966839 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e3799b0-c8b2-4204-8b12-62e28dee2c09/cinder-api-log/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.008695 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e3799b0-c8b2-4204-8b12-62e28dee2c09/cinder-api/0.log" Feb 24 03:52:50 crc 
kubenswrapper[4923]: I0224 03:52:50.064163 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf/cinder-scheduler/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.162962 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf/probe/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.232087 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2_8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.363416 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-smdtc_ec71f0e3-4ff0-46b6-a887-37132374b80c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.444497 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-4v9fw_7ad3dfbc-174b-4b0c-9d41-a0c51eead210/init/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.646480 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-4v9fw_7ad3dfbc-174b-4b0c-9d41-a0c51eead210/init/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.674815 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-4v9fw_7ad3dfbc-174b-4b0c-9d41-a0c51eead210/dnsmasq-dns/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.727952 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw_04b83327-1210-4a1a-b104-70fff61786bf/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.877092 4923 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3b54c615-8156-4ec6-aee7-b8c9448a574e/glance-httpd/0.log" Feb 24 03:52:50 crc kubenswrapper[4923]: I0224 03:52:50.918956 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3b54c615-8156-4ec6-aee7-b8c9448a574e/glance-log/0.log" Feb 24 03:52:51 crc kubenswrapper[4923]: I0224 03:52:51.060199 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2656e2e0-085f-443d-ad1c-2243a4f92a11/glance-httpd/0.log" Feb 24 03:52:51 crc kubenswrapper[4923]: I0224 03:52:51.100599 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2656e2e0-085f-443d-ad1c-2243a4f92a11/glance-log/0.log" Feb 24 03:52:51 crc kubenswrapper[4923]: I0224 03:52:51.252051 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dcbd8cd94-497ns_3cad919b-bb41-4c17-a13a-01831e715fd9/horizon/0.log" Feb 24 03:52:51 crc kubenswrapper[4923]: I0224 03:52:51.515868 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9_f8dab472-e2b2-4eab-8ced-7eed7b1bc842/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:51 crc kubenswrapper[4923]: I0224 03:52:51.797120 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-sg2sp_140f3efa-43c3-4d0b-a738-fc87e216c13b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:51 crc kubenswrapper[4923]: I0224 03:52:51.854627 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dcbd8cd94-497ns_3cad919b-bb41-4c17-a13a-01831e715fd9/horizon-log/0.log" Feb 24 03:52:51 crc kubenswrapper[4923]: I0224 03:52:51.953694 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-6cfd87c4f7-w99br_26c252fe-d59a-4053-946d-b75bea1a9c0b/keystone-api/0.log" Feb 24 03:52:52 crc kubenswrapper[4923]: I0224 03:52:52.031398 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_18af060f-9e29-435c-82a9-6bdd59867a46/kube-state-metrics/0.log" Feb 24 03:52:52 crc kubenswrapper[4923]: I0224 03:52:52.177025 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn_aca2992d-fbda-4dad-8ab4-02147a40ed9e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:52 crc kubenswrapper[4923]: I0224 03:52:52.527151 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-55b6b875d5-hmfv4_a13b787e-2ba9-4a5b-96d0-c1d044f4c958/neutron-api/0.log" Feb 24 03:52:52 crc kubenswrapper[4923]: I0224 03:52:52.575903 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-55b6b875d5-hmfv4_a13b787e-2ba9-4a5b-96d0-c1d044f4c958/neutron-httpd/0.log" Feb 24 03:52:52 crc kubenswrapper[4923]: I0224 03:52:52.789859 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d_a55af564-f005-452b-acb3-8fa3910b1485/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:53 crc kubenswrapper[4923]: I0224 03:52:53.329094 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5c079676-c0fa-49dd-94fe-360388e5014d/nova-cell0-conductor-conductor/0.log" Feb 24 03:52:53 crc kubenswrapper[4923]: I0224 03:52:53.346336 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_633331a8-df46-4c85-b234-1e2820565794/nova-api-log/0.log" Feb 24 03:52:53 crc kubenswrapper[4923]: I0224 03:52:53.492796 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_633331a8-df46-4c85-b234-1e2820565794/nova-api-api/0.log" Feb 24 
03:52:53 crc kubenswrapper[4923]: I0224 03:52:53.602309 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_5130427e-ca28-4060-ac80-72202959e07f/nova-cell1-conductor-conductor/0.log" Feb 24 03:52:53 crc kubenswrapper[4923]: I0224 03:52:53.718740 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3/nova-cell1-novncproxy-novncproxy/0.log" Feb 24 03:52:53 crc kubenswrapper[4923]: I0224 03:52:53.851036 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-rfz2h_08f21e51-2e83-4b47-b794-1e3a2358381d/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:54 crc kubenswrapper[4923]: I0224 03:52:53.993604 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9f2c858b-ff6d-44cb-9925-d4c0ef27f133/nova-metadata-log/0.log" Feb 24 03:52:54 crc kubenswrapper[4923]: I0224 03:52:54.268213 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5a8d4b38-c639-42b0-a48a-6193bee91648/nova-scheduler-scheduler/0.log" Feb 24 03:52:54 crc kubenswrapper[4923]: I0224 03:52:54.349332 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b2879b26-9173-4d23-b6f4-9c9e4c43f08e/mysql-bootstrap/0.log" Feb 24 03:52:54 crc kubenswrapper[4923]: I0224 03:52:54.512664 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b2879b26-9173-4d23-b6f4-9c9e4c43f08e/galera/0.log" Feb 24 03:52:54 crc kubenswrapper[4923]: I0224 03:52:54.536965 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b2879b26-9173-4d23-b6f4-9c9e4c43f08e/mysql-bootstrap/0.log" Feb 24 03:52:54 crc kubenswrapper[4923]: I0224 03:52:54.758749 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_71ebe37b-5831-4545-8f6a-8db6e194982f/mysql-bootstrap/0.log" Feb 24 03:52:54 crc kubenswrapper[4923]: I0224 03:52:54.918075 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_71ebe37b-5831-4545-8f6a-8db6e194982f/mysql-bootstrap/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.013499 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_71ebe37b-5831-4545-8f6a-8db6e194982f/galera/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.097350 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9f2c858b-ff6d-44cb-9925-d4c0ef27f133/nova-metadata-metadata/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.106513 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3/openstackclient/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.241616 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6l624_6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095/ovn-controller/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.282918 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pnpxx_459e20ec-ab36-4745-9a6b-8c3832560d72/openstack-network-exporter/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.484022 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-555wh_22ccde27-2e54-4d62-8cc4-8b12ea5e92a7/ovsdb-server-init/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.667941 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-555wh_22ccde27-2e54-4d62-8cc4-8b12ea5e92a7/ovsdb-server-init/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.725682 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-555wh_22ccde27-2e54-4d62-8cc4-8b12ea5e92a7/ovs-vswitchd/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.740888 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-555wh_22ccde27-2e54-4d62-8cc4-8b12ea5e92a7/ovsdb-server/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.886480 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hsl52_543b3843-407e-4043-a851-4170590b5a68/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.945309 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e0d59d8f-593d-437e-9450-93fb5bbaa025/openstack-network-exporter/0.log" Feb 24 03:52:55 crc kubenswrapper[4923]: I0224 03:52:55.991428 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e0d59d8f-593d-437e-9450-93fb5bbaa025/ovn-northd/0.log" Feb 24 03:52:56 crc kubenswrapper[4923]: I0224 03:52:56.128667 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_34d16b71-0cf5-4143-9225-3e44441dc2da/openstack-network-exporter/0.log" Feb 24 03:52:56 crc kubenswrapper[4923]: I0224 03:52:56.203761 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_34d16b71-0cf5-4143-9225-3e44441dc2da/ovsdbserver-nb/0.log" Feb 24 03:52:56 crc kubenswrapper[4923]: I0224 03:52:56.312913 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c976efe6-239a-4f24-a392-b1b5ba3545de/openstack-network-exporter/0.log" Feb 24 03:52:56 crc kubenswrapper[4923]: I0224 03:52:56.356835 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c976efe6-239a-4f24-a392-b1b5ba3545de/ovsdbserver-sb/0.log" Feb 24 03:52:56 crc kubenswrapper[4923]: I0224 03:52:56.530970 4923 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-9d7999766-h8pkz_88a4bad2-fdbb-4186-b218-093ff0cf4b9c/placement-api/0.log" Feb 24 03:52:56 crc kubenswrapper[4923]: I0224 03:52:56.654549 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9d7999766-h8pkz_88a4bad2-fdbb-4186-b218-093ff0cf4b9c/placement-log/0.log" Feb 24 03:52:56 crc kubenswrapper[4923]: I0224 03:52:56.740426 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6e4608b5-cf65-4bbc-b509-85261127fe10/setup-container/0.log" Feb 24 03:52:56 crc kubenswrapper[4923]: I0224 03:52:56.979239 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6e4608b5-cf65-4bbc-b509-85261127fe10/rabbitmq/0.log" Feb 24 03:52:56 crc kubenswrapper[4923]: I0224 03:52:56.994034 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6e4608b5-cf65-4bbc-b509-85261127fe10/setup-container/0.log" Feb 24 03:52:57 crc kubenswrapper[4923]: I0224 03:52:57.016094 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4bd51e0b-15c9-4042-ac7e-c05ed0a11374/setup-container/0.log" Feb 24 03:52:57 crc kubenswrapper[4923]: I0224 03:52:57.272385 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-976cg_3444189b-88ae-469b-810d-e92a9a0c17d8/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:57 crc kubenswrapper[4923]: I0224 03:52:57.305357 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4bd51e0b-15c9-4042-ac7e-c05ed0a11374/setup-container/0.log" Feb 24 03:52:57 crc kubenswrapper[4923]: I0224 03:52:57.309030 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4bd51e0b-15c9-4042-ac7e-c05ed0a11374/rabbitmq/0.log" Feb 24 03:52:57 crc kubenswrapper[4923]: I0224 03:52:57.492924 4923 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-r85nc_3816ebdb-67f1-4d77-835e-fd9323d883fd/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:57 crc kubenswrapper[4923]: I0224 03:52:57.589691 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6_d5eb03b7-77c3-4c05-a735-ce0c901c91cb/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:57 crc kubenswrapper[4923]: I0224 03:52:57.712769 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:52:57 crc kubenswrapper[4923]: E0224 03:52:57.713065 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:52:57 crc kubenswrapper[4923]: I0224 03:52:57.829364 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-f4pqm_a784fe18-eb1d-4e0d-84cb-9268b1904302/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:57 crc kubenswrapper[4923]: I0224 03:52:57.866812 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-m2jt2_b51afcd9-da3d-4f68-947a-c6af0a02cfaa/ssh-known-hosts-edpm-deployment/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.035740 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-645cdc8bdf-bkt49_28a2632f-7155-4c9e-9767-fcda3ff0688b/proxy-server/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.095831 4923 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-proxy-645cdc8bdf-bkt49_28a2632f-7155-4c9e-9767-fcda3ff0688b/proxy-httpd/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.150815 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-84w4c_6113f2e8-dd3f-42d4-92f3-8fd56e4b458c/swift-ring-rebalance/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.316365 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/account-reaper/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.332531 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/account-auditor/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.402745 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/account-replicator/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.487910 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/container-auditor/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.505748 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/account-server/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.551085 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/container-replicator/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.706002 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/container-updater/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.711282 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/container-server/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.759399 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-auditor/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.795175 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-expirer/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.887041 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-server/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.922381 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-replicator/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.986338 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-updater/0.log" Feb 24 03:52:58 crc kubenswrapper[4923]: I0224 03:52:58.994852 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/rsync/0.log" Feb 24 03:52:59 crc kubenswrapper[4923]: I0224 03:52:59.102767 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/swift-recon-cron/0.log" Feb 24 03:52:59 crc kubenswrapper[4923]: I0224 03:52:59.223467 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8_d8be4d6b-1c52-43ae-addf-ad44faf403f2/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:52:59 crc kubenswrapper[4923]: I0224 03:52:59.329047 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_9b6f2b0b-f8d2-4a36-a1a2-177dcf809761/tempest-tests-tempest-tests-runner/0.log" Feb 24 03:52:59 crc kubenswrapper[4923]: I0224 03:52:59.443531 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ff180506-c96d-4b80-8568-e972c702ff06/test-operator-logs-container/0.log" Feb 24 03:52:59 crc kubenswrapper[4923]: I0224 03:52:59.528956 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8_890d3a1a-7dcc-4033-95c0-a3507815e8ff/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 03:53:08 crc kubenswrapper[4923]: I0224 03:53:08.967241 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2c9c3801-205a-40fd-929f-587f5aaa9ca2/memcached/0.log" Feb 24 03:53:09 crc kubenswrapper[4923]: I0224 03:53:09.712713 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:53:09 crc kubenswrapper[4923]: E0224 03:53:09.713057 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 03:53:23 crc kubenswrapper[4923]: I0224 03:53:23.714543 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:53:24 crc kubenswrapper[4923]: I0224 03:53:24.522122 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/util/0.log" Feb 24 03:53:24 crc kubenswrapper[4923]: I0224 03:53:24.681484 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/util/0.log" Feb 24 03:53:24 crc kubenswrapper[4923]: I0224 03:53:24.737485 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/pull/0.log" Feb 24 03:53:24 crc kubenswrapper[4923]: I0224 03:53:24.737638 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/pull/0.log" Feb 24 03:53:24 crc kubenswrapper[4923]: I0224 03:53:24.913416 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/pull/0.log" Feb 24 03:53:24 crc kubenswrapper[4923]: I0224 03:53:24.921642 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/util/0.log" Feb 24 03:53:24 crc kubenswrapper[4923]: I0224 03:53:24.934685 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"9fe9e45e3a82c3754daa7fe6ed42ef47bf61b7c7019ca6a9f39fabf754ac5291"} Feb 24 03:53:24 crc kubenswrapper[4923]: I0224 03:53:24.990033 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/extract/0.log" Feb 24 03:53:25 crc kubenswrapper[4923]: I0224 03:53:25.428080 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-n2qf9_74dbbb69-5b9b-45b1-a74e-8bed20a6cbed/manager/0.log" Feb 24 03:53:25 crc kubenswrapper[4923]: I0224 03:53:25.876740 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-58z87_202e32ae-6025-43c3-90ee-d5a6ec2f7752/manager/0.log" Feb 24 03:53:25 crc kubenswrapper[4923]: I0224 03:53:25.955248 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-tbhxh_07169ece-c03c-464a-9899-f03b61426df5/manager/0.log" Feb 24 03:53:26 crc kubenswrapper[4923]: I0224 03:53:26.203023 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-2mdh9_2dda58b4-8524-47cf-9e31-f276859d0af1/manager/0.log" Feb 24 03:53:26 crc kubenswrapper[4923]: I0224 03:53:26.720992 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-msrg6_85db5c1e-a62f-496a-a8ce-0e32d4321ac9/manager/0.log" Feb 24 03:53:26 crc kubenswrapper[4923]: I0224 03:53:26.761033 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-pd2zn_1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1/manager/0.log" Feb 24 03:53:26 crc kubenswrapper[4923]: I0224 03:53:26.864916 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-bmn7v_26fc13d5-b98a-49ac-8c62-ee3ab08a9767/manager/0.log" Feb 24 03:53:27 crc kubenswrapper[4923]: I0224 03:53:27.067465 4923 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-tqvjv_6a78564f-37a8-4385-9f93-57ee3952d36c/manager/0.log" Feb 24 03:53:27 crc kubenswrapper[4923]: I0224 03:53:27.250153 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-4d7n7_7228fc47-38cb-4680-9104-d5657a853147/manager/0.log" Feb 24 03:53:27 crc kubenswrapper[4923]: I0224 03:53:27.369692 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-4c6hc_400c9c7a-a90c-4b16-b13d-25c26be22f93/manager/0.log" Feb 24 03:53:27 crc kubenswrapper[4923]: I0224 03:53:27.577791 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-x6vmb_039de08e-513c-47e3-a3f7-59b8911b7dae/manager/0.log" Feb 24 03:53:27 crc kubenswrapper[4923]: I0224 03:53:27.959746 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-r64q5_31f694c1-3948-4e87-90d1-5bd1e7d0aef6/manager/0.log" Feb 24 03:53:28 crc kubenswrapper[4923]: I0224 03:53:28.003696 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-c8n7r_063cbd60-dc19-4c47-96ca-7b9cb24bf2ef/manager/0.log" Feb 24 03:53:28 crc kubenswrapper[4923]: I0224 03:53:28.243010 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw_8c5a7840-9e6b-4442-b99e-1ce50bff0722/manager/0.log" Feb 24 03:53:28 crc kubenswrapper[4923]: I0224 03:53:28.645448 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5677cd7d77-l7zkx_357fa046-8d84-43b5-8e3c-d1fe18f5d2c5/operator/0.log" Feb 24 03:53:28 crc kubenswrapper[4923]: I0224 
03:53:28.772237 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zscrj_d988ae64-eb6b-4f86-a51d-5b61eb8b6d35/registry-server/0.log" Feb 24 03:53:29 crc kubenswrapper[4923]: I0224 03:53:29.002887 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-2zxl4_8b9a0e9e-0ef9-4b69-87f3-63cfb4204996/manager/0.log" Feb 24 03:53:29 crc kubenswrapper[4923]: I0224 03:53:29.099973 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-m8tjf_d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e/manager/0.log" Feb 24 03:53:29 crc kubenswrapper[4923]: I0224 03:53:29.244027 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-vczgc_7e46cf81-12eb-4c37-9f04-affbd9f153b7/operator/0.log" Feb 24 03:53:29 crc kubenswrapper[4923]: I0224 03:53:29.439189 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-zxhc7_a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e/manager/0.log" Feb 24 03:53:29 crc kubenswrapper[4923]: I0224 03:53:29.650164 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-f4b5c_4981f2a3-3977-40b7-819b-59cf400fa882/manager/0.log" Feb 24 03:53:29 crc kubenswrapper[4923]: I0224 03:53:29.738476 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-5pzgk_4ba7a21e-9aef-4596-9e18-21c66394cf74/manager/0.log" Feb 24 03:53:29 crc kubenswrapper[4923]: I0224 03:53:29.890112 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-2bgtt_ae81debf-3361-424d-afbe-9e4521997d23/manager/0.log" Feb 24 03:53:30 crc kubenswrapper[4923]: I0224 
03:53:30.143917 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dd698895-qrccn_6b2c8692-7382-4716-8770-47d21209898f/manager/0.log" Feb 24 03:53:31 crc kubenswrapper[4923]: I0224 03:53:31.405656 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-lnz72_13de02b1-8017-4d32-b848-08d241ef34d4/manager/0.log" Feb 24 03:53:49 crc kubenswrapper[4923]: I0224 03:53:49.536958 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rx89l_34d9b54c-a37d-407b-81c7-ff77a96b7dd8/control-plane-machine-set-operator/0.log" Feb 24 03:53:49 crc kubenswrapper[4923]: I0224 03:53:49.742334 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-82dxq_ae174d22-78c6-4699-9d9b-8ce566dc9f4c/kube-rbac-proxy/0.log" Feb 24 03:53:49 crc kubenswrapper[4923]: I0224 03:53:49.778196 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-82dxq_ae174d22-78c6-4699-9d9b-8ce566dc9f4c/machine-api-operator/0.log" Feb 24 03:54:02 crc kubenswrapper[4923]: I0224 03:54:02.746040 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-pbnww_779174d3-c69e-46be-b5e4-4210a6697e7b/cert-manager-controller/0.log" Feb 24 03:54:02 crc kubenswrapper[4923]: I0224 03:54:02.908488 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-6qbrj_b6864f21-fdba-416a-b777-a492c9c9e66c/cert-manager-cainjector/0.log" Feb 24 03:54:02 crc kubenswrapper[4923]: I0224 03:54:02.967662 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-r9qzx_60be919e-8301-45ed-9e67-6e54e6ddef7f/cert-manager-webhook/0.log" Feb 24 03:54:15 crc 
kubenswrapper[4923]: I0224 03:54:15.657942 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-bxdm5_36d48cde-8247-4729-aa2d-d6b99d25b198/nmstate-console-plugin/0.log" Feb 24 03:54:15 crc kubenswrapper[4923]: I0224 03:54:15.833818 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fp5bb_886309d8-6744-4d32-a729-225ef9679579/nmstate-handler/0.log" Feb 24 03:54:15 crc kubenswrapper[4923]: I0224 03:54:15.871317 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-mnljj_ae5580e9-8a55-4dbe-99c8-e21e22d3813e/kube-rbac-proxy/0.log" Feb 24 03:54:15 crc kubenswrapper[4923]: I0224 03:54:15.943523 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-mnljj_ae5580e9-8a55-4dbe-99c8-e21e22d3813e/nmstate-metrics/0.log" Feb 24 03:54:16 crc kubenswrapper[4923]: I0224 03:54:16.022690 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-k582j_0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd/nmstate-operator/0.log" Feb 24 03:54:16 crc kubenswrapper[4923]: I0224 03:54:16.114469 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-25qp5_8192c6db-fbcf-45b6-b43c-313abcc10d2e/nmstate-webhook/0.log" Feb 24 03:54:42 crc kubenswrapper[4923]: I0224 03:54:42.438112 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7bl75_ebf12b42-c896-4b13-954c-1ef5753c3fc0/kube-rbac-proxy/0.log" Feb 24 03:54:42 crc kubenswrapper[4923]: I0224 03:54:42.497104 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7bl75_ebf12b42-c896-4b13-954c-1ef5753c3fc0/controller/0.log" Feb 24 03:54:42 crc kubenswrapper[4923]: I0224 03:54:42.694105 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-8hjgx_b9c601e3-2c93-4989-a6e9-20542436ace6/frr-k8s-webhook-server/0.log" Feb 24 03:54:42 crc kubenswrapper[4923]: I0224 03:54:42.846121 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-frr-files/0.log" Feb 24 03:54:42 crc kubenswrapper[4923]: I0224 03:54:42.987893 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-frr-files/0.log" Feb 24 03:54:42 crc kubenswrapper[4923]: I0224 03:54:42.993000 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-reloader/0.log" Feb 24 03:54:42 crc kubenswrapper[4923]: I0224 03:54:42.995250 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-metrics/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.044583 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-reloader/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.195143 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-frr-files/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.223459 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-reloader/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.230700 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-metrics/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.247655 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-metrics/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.435958 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-reloader/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.445204 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-frr-files/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.469762 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/controller/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.475286 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-metrics/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.682478 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/kube-rbac-proxy-frr/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.702895 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/kube-rbac-proxy/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.705105 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/frr-metrics/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.870945 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/reloader/0.log" Feb 24 03:54:43 crc kubenswrapper[4923]: I0224 03:54:43.943772 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84dbcb4757-pvzfq_475ff3bc-195d-4768-892d-1c0274b3a25c/manager/0.log" Feb 24 03:54:44 crc kubenswrapper[4923]: I0224 03:54:44.104542 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79448499bc-9ssng_0fcd559c-f2fb-455a-9cc5-6be1cd6be98a/webhook-server/0.log" Feb 24 03:54:44 crc kubenswrapper[4923]: I0224 03:54:44.313766 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k5l69_94515a6b-ba32-4b66-9cbf-42f9e0e38d14/kube-rbac-proxy/0.log" Feb 24 03:54:44 crc kubenswrapper[4923]: I0224 03:54:44.732045 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k5l69_94515a6b-ba32-4b66-9cbf-42f9e0e38d14/speaker/0.log" Feb 24 03:54:44 crc kubenswrapper[4923]: I0224 03:54:44.946261 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/frr/0.log" Feb 24 03:54:57 crc kubenswrapper[4923]: I0224 03:54:57.956413 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/util/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.190964 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/pull/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.197512 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/pull/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.215760 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/util/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.376480 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/pull/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.405418 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/util/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.449182 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/extract/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.581962 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-utilities/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.767393 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-content/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.807312 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-utilities/0.log" Feb 24 03:54:58 crc kubenswrapper[4923]: I0224 03:54:58.833103 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-content/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.028280 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-utilities/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.028851 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-content/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.210232 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-utilities/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.433874 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-content/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.444805 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-utilities/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.534130 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/registry-server/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.594514 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-content/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.657639 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-utilities/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.664953 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-content/0.log" Feb 24 03:54:59 crc kubenswrapper[4923]: I0224 03:54:59.910934 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/util/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.089873 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/util/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.104492 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/pull/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.138666 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/pull/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.250319 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/registry-server/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.359457 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/util/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.384655 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/pull/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.387307 
4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/extract/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.538042 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-f8kx9_05612e34-43ff-4719-9bb6-46364400281f/marketplace-operator/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.694083 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-utilities/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.926931 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-content/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.927506 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-content/0.log" Feb 24 03:55:00 crc kubenswrapper[4923]: I0224 03:55:00.943665 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-utilities/0.log" Feb 24 03:55:01 crc kubenswrapper[4923]: I0224 03:55:01.052735 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-utilities/0.log" Feb 24 03:55:01 crc kubenswrapper[4923]: I0224 03:55:01.129813 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-content/0.log" Feb 24 03:55:01 crc kubenswrapper[4923]: I0224 03:55:01.217468 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/registry-server/0.log" Feb 24 03:55:01 crc kubenswrapper[4923]: I0224 03:55:01.285268 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-utilities/0.log" Feb 24 03:55:01 crc kubenswrapper[4923]: I0224 03:55:01.466470 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-content/0.log" Feb 24 03:55:01 crc kubenswrapper[4923]: I0224 03:55:01.468572 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-utilities/0.log" Feb 24 03:55:01 crc kubenswrapper[4923]: I0224 03:55:01.489653 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-content/0.log" Feb 24 03:55:01 crc kubenswrapper[4923]: I0224 03:55:01.688634 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-utilities/0.log" Feb 24 03:55:01 crc kubenswrapper[4923]: I0224 03:55:01.717833 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-content/0.log" Feb 24 03:55:02 crc kubenswrapper[4923]: I0224 03:55:02.219078 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/registry-server/0.log" Feb 24 03:55:20 crc kubenswrapper[4923]: E0224 03:55:20.765968 4923 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.194:47488->38.102.83.194:35219: write tcp 38.102.83.194:47488->38.102.83.194:35219: 
write: broken pipe Feb 24 03:55:49 crc kubenswrapper[4923]: I0224 03:55:49.916584 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:55:49 crc kubenswrapper[4923]: I0224 03:55:49.917059 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:56:19 crc kubenswrapper[4923]: I0224 03:56:19.916339 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:56:19 crc kubenswrapper[4923]: I0224 03:56:19.917232 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:56:45 crc kubenswrapper[4923]: I0224 03:56:45.931560 4923 generic.go:334] "Generic (PLEG): container finished" podID="fa9f5cc7-6960-477e-8d16-beaa00b35881" containerID="91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273" exitCode=0 Feb 24 03:56:45 crc kubenswrapper[4923]: I0224 03:56:45.932167 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zvvp7/must-gather-blt9n" 
event={"ID":"fa9f5cc7-6960-477e-8d16-beaa00b35881","Type":"ContainerDied","Data":"91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273"} Feb 24 03:56:45 crc kubenswrapper[4923]: I0224 03:56:45.933756 4923 scope.go:117] "RemoveContainer" containerID="91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273" Feb 24 03:56:46 crc kubenswrapper[4923]: I0224 03:56:46.942995 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zvvp7_must-gather-blt9n_fa9f5cc7-6960-477e-8d16-beaa00b35881/gather/0.log" Feb 24 03:56:49 crc kubenswrapper[4923]: I0224 03:56:49.916244 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:56:49 crc kubenswrapper[4923]: I0224 03:56:49.918349 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:56:49 crc kubenswrapper[4923]: I0224 03:56:49.918589 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 03:56:49 crc kubenswrapper[4923]: I0224 03:56:49.920023 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fe9e45e3a82c3754daa7fe6ed42ef47bf61b7c7019ca6a9f39fabf754ac5291"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 03:56:49 crc kubenswrapper[4923]: I0224 
03:56:49.920486 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://9fe9e45e3a82c3754daa7fe6ed42ef47bf61b7c7019ca6a9f39fabf754ac5291" gracePeriod=600 Feb 24 03:56:50 crc kubenswrapper[4923]: I0224 03:56:50.989070 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="9fe9e45e3a82c3754daa7fe6ed42ef47bf61b7c7019ca6a9f39fabf754ac5291" exitCode=0 Feb 24 03:56:50 crc kubenswrapper[4923]: I0224 03:56:50.989151 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"9fe9e45e3a82c3754daa7fe6ed42ef47bf61b7c7019ca6a9f39fabf754ac5291"} Feb 24 03:56:50 crc kubenswrapper[4923]: I0224 03:56:50.989876 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3"} Feb 24 03:56:50 crc kubenswrapper[4923]: I0224 03:56:50.989920 4923 scope.go:117] "RemoveContainer" containerID="4176cd9678443c0e80e2a1e4607755ffc8c115edf2cdd26232da15cf996b34b5" Feb 24 03:56:54 crc kubenswrapper[4923]: I0224 03:56:54.973336 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zvvp7/must-gather-blt9n"] Feb 24 03:56:54 crc kubenswrapper[4923]: I0224 03:56:54.974160 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zvvp7/must-gather-blt9n" podUID="fa9f5cc7-6960-477e-8d16-beaa00b35881" containerName="copy" containerID="cri-o://856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e" gracePeriod=2 Feb 24 03:56:54 
crc kubenswrapper[4923]: I0224 03:56:54.983590 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zvvp7/must-gather-blt9n"] Feb 24 03:56:55 crc kubenswrapper[4923]: I0224 03:56:55.535466 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zvvp7_must-gather-blt9n_fa9f5cc7-6960-477e-8d16-beaa00b35881/copy/0.log" Feb 24 03:56:55 crc kubenswrapper[4923]: I0224 03:56:55.536004 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:56:55 crc kubenswrapper[4923]: I0224 03:56:55.619984 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48qn6\" (UniqueName: \"kubernetes.io/projected/fa9f5cc7-6960-477e-8d16-beaa00b35881-kube-api-access-48qn6\") pod \"fa9f5cc7-6960-477e-8d16-beaa00b35881\" (UID: \"fa9f5cc7-6960-477e-8d16-beaa00b35881\") " Feb 24 03:56:55 crc kubenswrapper[4923]: I0224 03:56:55.620149 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fa9f5cc7-6960-477e-8d16-beaa00b35881-must-gather-output\") pod \"fa9f5cc7-6960-477e-8d16-beaa00b35881\" (UID: \"fa9f5cc7-6960-477e-8d16-beaa00b35881\") " Feb 24 03:56:55 crc kubenswrapper[4923]: I0224 03:56:55.627345 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9f5cc7-6960-477e-8d16-beaa00b35881-kube-api-access-48qn6" (OuterVolumeSpecName: "kube-api-access-48qn6") pod "fa9f5cc7-6960-477e-8d16-beaa00b35881" (UID: "fa9f5cc7-6960-477e-8d16-beaa00b35881"). InnerVolumeSpecName "kube-api-access-48qn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 03:56:55 crc kubenswrapper[4923]: I0224 03:56:55.722394 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48qn6\" (UniqueName: \"kubernetes.io/projected/fa9f5cc7-6960-477e-8d16-beaa00b35881-kube-api-access-48qn6\") on node \"crc\" DevicePath \"\"" Feb 24 03:56:55 crc kubenswrapper[4923]: I0224 03:56:55.778597 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9f5cc7-6960-477e-8d16-beaa00b35881-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fa9f5cc7-6960-477e-8d16-beaa00b35881" (UID: "fa9f5cc7-6960-477e-8d16-beaa00b35881"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 03:56:55 crc kubenswrapper[4923]: I0224 03:56:55.824438 4923 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fa9f5cc7-6960-477e-8d16-beaa00b35881-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 24 03:56:56 crc kubenswrapper[4923]: I0224 03:56:56.051335 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zvvp7_must-gather-blt9n_fa9f5cc7-6960-477e-8d16-beaa00b35881/copy/0.log" Feb 24 03:56:56 crc kubenswrapper[4923]: I0224 03:56:56.051840 4923 generic.go:334] "Generic (PLEG): container finished" podID="fa9f5cc7-6960-477e-8d16-beaa00b35881" containerID="856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e" exitCode=143 Feb 24 03:56:56 crc kubenswrapper[4923]: I0224 03:56:56.051912 4923 scope.go:117] "RemoveContainer" containerID="856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e" Feb 24 03:56:56 crc kubenswrapper[4923]: I0224 03:56:56.051963 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zvvp7/must-gather-blt9n" Feb 24 03:56:56 crc kubenswrapper[4923]: I0224 03:56:56.080349 4923 scope.go:117] "RemoveContainer" containerID="91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273" Feb 24 03:56:56 crc kubenswrapper[4923]: I0224 03:56:56.150574 4923 scope.go:117] "RemoveContainer" containerID="856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e" Feb 24 03:56:56 crc kubenswrapper[4923]: E0224 03:56:56.151247 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e\": container with ID starting with 856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e not found: ID does not exist" containerID="856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e" Feb 24 03:56:56 crc kubenswrapper[4923]: I0224 03:56:56.151284 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e"} err="failed to get container status \"856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e\": rpc error: code = NotFound desc = could not find container \"856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e\": container with ID starting with 856a1e819c7e9a2fd2d0bf5fe442305e16b51f46604a68dfaec4a39e21938f7e not found: ID does not exist" Feb 24 03:56:56 crc kubenswrapper[4923]: I0224 03:56:56.151361 4923 scope.go:117] "RemoveContainer" containerID="91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273" Feb 24 03:56:56 crc kubenswrapper[4923]: E0224 03:56:56.151807 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273\": container with ID starting with 
91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273 not found: ID does not exist" containerID="91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273" Feb 24 03:56:56 crc kubenswrapper[4923]: I0224 03:56:56.151830 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273"} err="failed to get container status \"91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273\": rpc error: code = NotFound desc = could not find container \"91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273\": container with ID starting with 91b814c87f245e94518f5c0ac2d0376d189a1f0b5fc4fa8642615018fb25d273 not found: ID does not exist" Feb 24 03:56:57 crc kubenswrapper[4923]: I0224 03:56:57.728732 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9f5cc7-6960-477e-8d16-beaa00b35881" path="/var/lib/kubelet/pods/fa9f5cc7-6960-477e-8d16-beaa00b35881/volumes" Feb 24 03:56:58 crc kubenswrapper[4923]: I0224 03:56:58.856759 4923 scope.go:117] "RemoveContainer" containerID="a65fd811bde2209053f67e66ef136b0f10e2ea4ffc9f73afb32488a5a9a421c8" Feb 24 03:56:58 crc kubenswrapper[4923]: I0224 03:56:58.891137 4923 scope.go:117] "RemoveContainer" containerID="cb6c9706d9139e8f0d293ff73b10df2e6835429d523446eef06d1d6a1e4b7181" Feb 24 03:56:58 crc kubenswrapper[4923]: I0224 03:56:58.918627 4923 scope.go:117] "RemoveContainer" containerID="0e43d42f14014c4a31070a1b89e194077686225407f1421dd8d1f22173ce2b4f" Feb 24 03:57:59 crc kubenswrapper[4923]: I0224 03:57:59.024177 4923 scope.go:117] "RemoveContainer" containerID="e88786dcadd62521a93e23f2eed0f352c140b9dde3909c720cd0e8f17be47f2a" Feb 24 03:57:59 crc kubenswrapper[4923]: I0224 03:57:59.049434 4923 scope.go:117] "RemoveContainer" containerID="78f4a8c69e7150eeba2a4f6846788bfe79e7ec5c8f4b66ac838547a014b40e0b" Feb 24 03:57:59 crc kubenswrapper[4923]: I0224 03:57:59.110814 4923 scope.go:117] 
"RemoveContainer" containerID="c96374c0efb10e284b6571cf620e61f89b9f33573841d6ecb5d6e5f7b29c9d6a" Feb 24 03:57:59 crc kubenswrapper[4923]: I0224 03:57:59.151809 4923 scope.go:117] "RemoveContainer" containerID="a556c89def9637f1167713ec3e304fe89dd70fac2a6bdb7d93cbc78ec804844b" Feb 24 03:59:19 crc kubenswrapper[4923]: I0224 03:59:19.916484 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:59:19 crc kubenswrapper[4923]: I0224 03:59:19.916961 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:59:30 crc kubenswrapper[4923]: I0224 03:59:30.347387 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5hz2x" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.242340 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lbw7w"] Feb 24 03:59:45 crc kubenswrapper[4923]: E0224 03:59:45.243265 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec11254-9351-4e31-8176-8491c229c0c4" containerName="container-00" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.243278 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec11254-9351-4e31-8176-8491c229c0c4" containerName="container-00" Feb 24 03:59:45 crc kubenswrapper[4923]: E0224 03:59:45.243316 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9f5cc7-6960-477e-8d16-beaa00b35881" containerName="copy" Feb 24 03:59:45 crc kubenswrapper[4923]: 
I0224 03:59:45.243322 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9f5cc7-6960-477e-8d16-beaa00b35881" containerName="copy" Feb 24 03:59:45 crc kubenswrapper[4923]: E0224 03:59:45.243335 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9f5cc7-6960-477e-8d16-beaa00b35881" containerName="gather" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.243342 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9f5cc7-6960-477e-8d16-beaa00b35881" containerName="gather" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.245324 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9f5cc7-6960-477e-8d16-beaa00b35881" containerName="copy" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.245356 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9f5cc7-6960-477e-8d16-beaa00b35881" containerName="gather" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.245364 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec11254-9351-4e31-8176-8491c229c0c4" containerName="container-00" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.246755 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.260059 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-utilities\") pod \"redhat-operators-lbw7w\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") " pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.260438 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrln\" (UniqueName: \"kubernetes.io/projected/b46fceca-15bf-4ca0-b015-7273e44de29a-kube-api-access-rvrln\") pod \"redhat-operators-lbw7w\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") " pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.260666 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-catalog-content\") pod \"redhat-operators-lbw7w\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") " pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.263102 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lbw7w"] Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.363660 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-catalog-content\") pod \"redhat-operators-lbw7w\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") " pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.363927 4923 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-utilities\") pod \"redhat-operators-lbw7w\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") " pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.363994 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrln\" (UniqueName: \"kubernetes.io/projected/b46fceca-15bf-4ca0-b015-7273e44de29a-kube-api-access-rvrln\") pod \"redhat-operators-lbw7w\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") " pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.364113 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-catalog-content\") pod \"redhat-operators-lbw7w\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") " pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.364403 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-utilities\") pod \"redhat-operators-lbw7w\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") " pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.393386 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrln\" (UniqueName: \"kubernetes.io/projected/b46fceca-15bf-4ca0-b015-7273e44de29a-kube-api-access-rvrln\") pod \"redhat-operators-lbw7w\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") " pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:45 crc kubenswrapper[4923]: I0224 03:59:45.616146 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:46 crc kubenswrapper[4923]: I0224 03:59:46.099591 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lbw7w"] Feb 24 03:59:46 crc kubenswrapper[4923]: I0224 03:59:46.851678 4923 generic.go:334] "Generic (PLEG): container finished" podID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerID="86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347" exitCode=0 Feb 24 03:59:46 crc kubenswrapper[4923]: I0224 03:59:46.851734 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw7w" event={"ID":"b46fceca-15bf-4ca0-b015-7273e44de29a","Type":"ContainerDied","Data":"86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347"} Feb 24 03:59:46 crc kubenswrapper[4923]: I0224 03:59:46.851771 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw7w" event={"ID":"b46fceca-15bf-4ca0-b015-7273e44de29a","Type":"ContainerStarted","Data":"0c2bc7daaf95c896a25da1cbf1e4639a478408c4ed85e189e90e8873c897c3a0"} Feb 24 03:59:46 crc kubenswrapper[4923]: I0224 03:59:46.854194 4923 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 03:59:47 crc kubenswrapper[4923]: I0224 03:59:47.863899 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw7w" event={"ID":"b46fceca-15bf-4ca0-b015-7273e44de29a","Type":"ContainerStarted","Data":"4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a"} Feb 24 03:59:48 crc kubenswrapper[4923]: I0224 03:59:48.872843 4923 generic.go:334] "Generic (PLEG): container finished" podID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerID="4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a" exitCode=0 Feb 24 03:59:48 crc kubenswrapper[4923]: I0224 03:59:48.872962 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lbw7w" event={"ID":"b46fceca-15bf-4ca0-b015-7273e44de29a","Type":"ContainerDied","Data":"4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a"} Feb 24 03:59:49 crc kubenswrapper[4923]: I0224 03:59:49.885021 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw7w" event={"ID":"b46fceca-15bf-4ca0-b015-7273e44de29a","Type":"ContainerStarted","Data":"8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b"} Feb 24 03:59:49 crc kubenswrapper[4923]: I0224 03:59:49.910252 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lbw7w" podStartSLOduration=2.513922108 podStartE2EDuration="4.91023634s" podCreationTimestamp="2026-02-24 03:59:45 +0000 UTC" firstStartedPulling="2026-02-24 03:59:46.854002997 +0000 UTC m=+3910.871073810" lastFinishedPulling="2026-02-24 03:59:49.250317219 +0000 UTC m=+3913.267388042" observedRunningTime="2026-02-24 03:59:49.908157536 +0000 UTC m=+3913.925228359" watchObservedRunningTime="2026-02-24 03:59:49.91023634 +0000 UTC m=+3913.927307153" Feb 24 03:59:49 crc kubenswrapper[4923]: I0224 03:59:49.916765 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 03:59:49 crc kubenswrapper[4923]: I0224 03:59:49.916822 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.473645 4923 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-must-gather-przsk/must-gather-w6qlt"] Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.475246 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.478204 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-przsk"/"openshift-service-ca.crt" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.478240 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-przsk"/"kube-root-ca.crt" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.478331 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-przsk"/"default-dockercfg-96t2r" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.486220 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-przsk/must-gather-w6qlt"] Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.593930 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6a2dd88-cbaa-4497-9978-ddec403316a2-must-gather-output\") pod \"must-gather-w6qlt\" (UID: \"a6a2dd88-cbaa-4497-9978-ddec403316a2\") " pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.594098 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fb9m\" (UniqueName: \"kubernetes.io/projected/a6a2dd88-cbaa-4497-9978-ddec403316a2-kube-api-access-6fb9m\") pod \"must-gather-w6qlt\" (UID: \"a6a2dd88-cbaa-4497-9978-ddec403316a2\") " pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.695655 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/a6a2dd88-cbaa-4497-9978-ddec403316a2-must-gather-output\") pod \"must-gather-w6qlt\" (UID: \"a6a2dd88-cbaa-4497-9978-ddec403316a2\") " pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.695833 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fb9m\" (UniqueName: \"kubernetes.io/projected/a6a2dd88-cbaa-4497-9978-ddec403316a2-kube-api-access-6fb9m\") pod \"must-gather-w6qlt\" (UID: \"a6a2dd88-cbaa-4497-9978-ddec403316a2\") " pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.696339 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6a2dd88-cbaa-4497-9978-ddec403316a2-must-gather-output\") pod \"must-gather-w6qlt\" (UID: \"a6a2dd88-cbaa-4497-9978-ddec403316a2\") " pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.734936 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fb9m\" (UniqueName: \"kubernetes.io/projected/a6a2dd88-cbaa-4497-9978-ddec403316a2-kube-api-access-6fb9m\") pod \"must-gather-w6qlt\" (UID: \"a6a2dd88-cbaa-4497-9978-ddec403316a2\") " pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 03:59:54 crc kubenswrapper[4923]: I0224 03:59:54.795308 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 03:59:55 crc kubenswrapper[4923]: I0224 03:59:55.277710 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-przsk/must-gather-w6qlt"] Feb 24 03:59:55 crc kubenswrapper[4923]: I0224 03:59:55.617105 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:55 crc kubenswrapper[4923]: I0224 03:59:55.617452 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 03:59:55 crc kubenswrapper[4923]: I0224 03:59:55.960483 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/must-gather-w6qlt" event={"ID":"a6a2dd88-cbaa-4497-9978-ddec403316a2","Type":"ContainerStarted","Data":"d0489b28e88b3e56edb268637bfe60f72375dbf5e698c6f1a84f375de0311bf7"} Feb 24 03:59:55 crc kubenswrapper[4923]: I0224 03:59:55.960525 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/must-gather-w6qlt" event={"ID":"a6a2dd88-cbaa-4497-9978-ddec403316a2","Type":"ContainerStarted","Data":"22098e4b20089e59beaa08bf74fc42d06b2fbc0b25a3e98da5b26288be3997ba"} Feb 24 03:59:55 crc kubenswrapper[4923]: I0224 03:59:55.960534 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/must-gather-w6qlt" event={"ID":"a6a2dd88-cbaa-4497-9978-ddec403316a2","Type":"ContainerStarted","Data":"c1e9f166e4bf913d0d4ae3da1b3f48fb1b5ca2653fae9ab9fb46bf1b745d4240"} Feb 24 03:59:55 crc kubenswrapper[4923]: I0224 03:59:55.981515 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-przsk/must-gather-w6qlt" podStartSLOduration=1.981487311 podStartE2EDuration="1.981487311s" podCreationTimestamp="2026-02-24 03:59:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-24 03:59:55.977523318 +0000 UTC m=+3919.994594161" watchObservedRunningTime="2026-02-24 03:59:55.981487311 +0000 UTC m=+3919.998558154" Feb 24 03:59:56 crc kubenswrapper[4923]: I0224 03:59:56.681563 4923 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lbw7w" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerName="registry-server" probeResult="failure" output=< Feb 24 03:59:56 crc kubenswrapper[4923]: timeout: failed to connect service ":50051" within 1s Feb 24 03:59:56 crc kubenswrapper[4923]: > Feb 24 03:59:59 crc kubenswrapper[4923]: I0224 03:59:59.270527 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-przsk/crc-debug-clb8f"] Feb 24 03:59:59 crc kubenswrapper[4923]: I0224 03:59:59.273043 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-clb8f" Feb 24 03:59:59 crc kubenswrapper[4923]: I0224 03:59:59.392590 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37037463-da34-4501-8f0d-2190ad5ae59f-host\") pod \"crc-debug-clb8f\" (UID: \"37037463-da34-4501-8f0d-2190ad5ae59f\") " pod="openshift-must-gather-przsk/crc-debug-clb8f" Feb 24 03:59:59 crc kubenswrapper[4923]: I0224 03:59:59.393022 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7nng\" (UniqueName: \"kubernetes.io/projected/37037463-da34-4501-8f0d-2190ad5ae59f-kube-api-access-f7nng\") pod \"crc-debug-clb8f\" (UID: \"37037463-da34-4501-8f0d-2190ad5ae59f\") " pod="openshift-must-gather-przsk/crc-debug-clb8f" Feb 24 03:59:59 crc kubenswrapper[4923]: I0224 03:59:59.494947 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7nng\" (UniqueName: 
\"kubernetes.io/projected/37037463-da34-4501-8f0d-2190ad5ae59f-kube-api-access-f7nng\") pod \"crc-debug-clb8f\" (UID: \"37037463-da34-4501-8f0d-2190ad5ae59f\") " pod="openshift-must-gather-przsk/crc-debug-clb8f" Feb 24 03:59:59 crc kubenswrapper[4923]: I0224 03:59:59.495021 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37037463-da34-4501-8f0d-2190ad5ae59f-host\") pod \"crc-debug-clb8f\" (UID: \"37037463-da34-4501-8f0d-2190ad5ae59f\") " pod="openshift-must-gather-przsk/crc-debug-clb8f" Feb 24 03:59:59 crc kubenswrapper[4923]: I0224 03:59:59.495111 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37037463-da34-4501-8f0d-2190ad5ae59f-host\") pod \"crc-debug-clb8f\" (UID: \"37037463-da34-4501-8f0d-2190ad5ae59f\") " pod="openshift-must-gather-przsk/crc-debug-clb8f" Feb 24 03:59:59 crc kubenswrapper[4923]: I0224 03:59:59.535359 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7nng\" (UniqueName: \"kubernetes.io/projected/37037463-da34-4501-8f0d-2190ad5ae59f-kube-api-access-f7nng\") pod \"crc-debug-clb8f\" (UID: \"37037463-da34-4501-8f0d-2190ad5ae59f\") " pod="openshift-must-gather-przsk/crc-debug-clb8f" Feb 24 03:59:59 crc kubenswrapper[4923]: I0224 03:59:59.593456 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-clb8f" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.005165 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/crc-debug-clb8f" event={"ID":"37037463-da34-4501-8f0d-2190ad5ae59f","Type":"ContainerStarted","Data":"879be6631b2a6c12745ac05722508ba07dd5cba052464abb0b8636c1f3fd5c9c"} Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.234760 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5"] Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.236260 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.242342 4923 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.242434 4923 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.254178 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5"] Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.313768 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e616521b-d0a5-44f5-b4a1-66793042fdc8-secret-volume\") pod \"collect-profiles-29531760-xdwc5\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.313868 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5vzj9\" (UniqueName: \"kubernetes.io/projected/e616521b-d0a5-44f5-b4a1-66793042fdc8-kube-api-access-5vzj9\") pod \"collect-profiles-29531760-xdwc5\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.313928 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e616521b-d0a5-44f5-b4a1-66793042fdc8-config-volume\") pod \"collect-profiles-29531760-xdwc5\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.416258 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e616521b-d0a5-44f5-b4a1-66793042fdc8-secret-volume\") pod \"collect-profiles-29531760-xdwc5\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.416730 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzj9\" (UniqueName: \"kubernetes.io/projected/e616521b-d0a5-44f5-b4a1-66793042fdc8-kube-api-access-5vzj9\") pod \"collect-profiles-29531760-xdwc5\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.416790 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e616521b-d0a5-44f5-b4a1-66793042fdc8-config-volume\") pod \"collect-profiles-29531760-xdwc5\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.417874 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e616521b-d0a5-44f5-b4a1-66793042fdc8-config-volume\") pod \"collect-profiles-29531760-xdwc5\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.423501 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e616521b-d0a5-44f5-b4a1-66793042fdc8-secret-volume\") pod \"collect-profiles-29531760-xdwc5\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.432841 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzj9\" (UniqueName: \"kubernetes.io/projected/e616521b-d0a5-44f5-b4a1-66793042fdc8-kube-api-access-5vzj9\") pod \"collect-profiles-29531760-xdwc5\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:00 crc kubenswrapper[4923]: I0224 04:00:00.623509 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:01 crc kubenswrapper[4923]: I0224 04:00:01.014453 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/crc-debug-clb8f" event={"ID":"37037463-da34-4501-8f0d-2190ad5ae59f","Type":"ContainerStarted","Data":"55371fa78305f9a6466e15d46a9b35dc6a5986eb2eeb46ad4a9294c2f2a67914"} Feb 24 04:00:01 crc kubenswrapper[4923]: I0224 04:00:01.028924 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-przsk/crc-debug-clb8f" podStartSLOduration=2.028905356 podStartE2EDuration="2.028905356s" podCreationTimestamp="2026-02-24 03:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 04:00:01.028359532 +0000 UTC m=+3925.045430345" watchObservedRunningTime="2026-02-24 04:00:01.028905356 +0000 UTC m=+3925.045976169" Feb 24 04:00:01 crc kubenswrapper[4923]: W0224 04:00:01.097247 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode616521b_d0a5_44f5_b4a1_66793042fdc8.slice/crio-9c40a65bab8e9420674706405f7498dca4d0901bc0699ea41c57cd4ecdbf6182 WatchSource:0}: Error finding container 9c40a65bab8e9420674706405f7498dca4d0901bc0699ea41c57cd4ecdbf6182: Status 404 returned error can't find the container with id 9c40a65bab8e9420674706405f7498dca4d0901bc0699ea41c57cd4ecdbf6182 Feb 24 04:00:01 crc kubenswrapper[4923]: I0224 04:00:01.099640 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5"] Feb 24 04:00:02 crc kubenswrapper[4923]: I0224 04:00:02.037091 4923 generic.go:334] "Generic (PLEG): container finished" podID="e616521b-d0a5-44f5-b4a1-66793042fdc8" containerID="badec08c2b400bca0dc0acda937a0653c4fe756618e8981773eef8e420b72a06" exitCode=0 Feb 24 04:00:02 
crc kubenswrapper[4923]: I0224 04:00:02.039304 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" event={"ID":"e616521b-d0a5-44f5-b4a1-66793042fdc8","Type":"ContainerDied","Data":"badec08c2b400bca0dc0acda937a0653c4fe756618e8981773eef8e420b72a06"} Feb 24 04:00:02 crc kubenswrapper[4923]: I0224 04:00:02.039337 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" event={"ID":"e616521b-d0a5-44f5-b4a1-66793042fdc8","Type":"ContainerStarted","Data":"9c40a65bab8e9420674706405f7498dca4d0901bc0699ea41c57cd4ecdbf6182"} Feb 24 04:00:03 crc kubenswrapper[4923]: I0224 04:00:03.457084 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:03 crc kubenswrapper[4923]: I0224 04:00:03.592817 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vzj9\" (UniqueName: \"kubernetes.io/projected/e616521b-d0a5-44f5-b4a1-66793042fdc8-kube-api-access-5vzj9\") pod \"e616521b-d0a5-44f5-b4a1-66793042fdc8\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " Feb 24 04:00:03 crc kubenswrapper[4923]: I0224 04:00:03.592925 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e616521b-d0a5-44f5-b4a1-66793042fdc8-secret-volume\") pod \"e616521b-d0a5-44f5-b4a1-66793042fdc8\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " Feb 24 04:00:03 crc kubenswrapper[4923]: I0224 04:00:03.593086 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e616521b-d0a5-44f5-b4a1-66793042fdc8-config-volume\") pod \"e616521b-d0a5-44f5-b4a1-66793042fdc8\" (UID: \"e616521b-d0a5-44f5-b4a1-66793042fdc8\") " Feb 24 04:00:03 crc 
kubenswrapper[4923]: I0224 04:00:03.593650 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e616521b-d0a5-44f5-b4a1-66793042fdc8-config-volume" (OuterVolumeSpecName: "config-volume") pod "e616521b-d0a5-44f5-b4a1-66793042fdc8" (UID: "e616521b-d0a5-44f5-b4a1-66793042fdc8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 04:00:03 crc kubenswrapper[4923]: I0224 04:00:03.594041 4923 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e616521b-d0a5-44f5-b4a1-66793042fdc8-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 04:00:03 crc kubenswrapper[4923]: I0224 04:00:03.599127 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e616521b-d0a5-44f5-b4a1-66793042fdc8-kube-api-access-5vzj9" (OuterVolumeSpecName: "kube-api-access-5vzj9") pod "e616521b-d0a5-44f5-b4a1-66793042fdc8" (UID: "e616521b-d0a5-44f5-b4a1-66793042fdc8"). InnerVolumeSpecName "kube-api-access-5vzj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 04:00:03 crc kubenswrapper[4923]: I0224 04:00:03.613530 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e616521b-d0a5-44f5-b4a1-66793042fdc8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e616521b-d0a5-44f5-b4a1-66793042fdc8" (UID: "e616521b-d0a5-44f5-b4a1-66793042fdc8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 04:00:03 crc kubenswrapper[4923]: I0224 04:00:03.695557 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vzj9\" (UniqueName: \"kubernetes.io/projected/e616521b-d0a5-44f5-b4a1-66793042fdc8-kube-api-access-5vzj9\") on node \"crc\" DevicePath \"\"" Feb 24 04:00:03 crc kubenswrapper[4923]: I0224 04:00:03.695586 4923 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e616521b-d0a5-44f5-b4a1-66793042fdc8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 04:00:04 crc kubenswrapper[4923]: I0224 04:00:04.060132 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" event={"ID":"e616521b-d0a5-44f5-b4a1-66793042fdc8","Type":"ContainerDied","Data":"9c40a65bab8e9420674706405f7498dca4d0901bc0699ea41c57cd4ecdbf6182"} Feb 24 04:00:04 crc kubenswrapper[4923]: I0224 04:00:04.060177 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c40a65bab8e9420674706405f7498dca4d0901bc0699ea41c57cd4ecdbf6182" Feb 24 04:00:04 crc kubenswrapper[4923]: I0224 04:00:04.060188 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29531760-xdwc5" Feb 24 04:00:04 crc kubenswrapper[4923]: I0224 04:00:04.541952 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"] Feb 24 04:00:04 crc kubenswrapper[4923]: I0224 04:00:04.552155 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29531715-r92t4"] Feb 24 04:00:05 crc kubenswrapper[4923]: I0224 04:00:05.662136 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 04:00:05 crc kubenswrapper[4923]: I0224 04:00:05.721855 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8812730-dcd0-44d4-a795-256a1c1810e4" path="/var/lib/kubelet/pods/e8812730-dcd0-44d4-a795-256a1c1810e4/volumes" Feb 24 04:00:05 crc kubenswrapper[4923]: I0224 04:00:05.722645 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lbw7w" Feb 24 04:00:05 crc kubenswrapper[4923]: I0224 04:00:05.897247 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lbw7w"] Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.082487 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lbw7w" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerName="registry-server" containerID="cri-o://8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b" gracePeriod=2 Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.605311 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lbw7w"
Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.684523 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvrln\" (UniqueName: \"kubernetes.io/projected/b46fceca-15bf-4ca0-b015-7273e44de29a-kube-api-access-rvrln\") pod \"b46fceca-15bf-4ca0-b015-7273e44de29a\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") "
Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.684624 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-utilities\") pod \"b46fceca-15bf-4ca0-b015-7273e44de29a\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") "
Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.684721 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-catalog-content\") pod \"b46fceca-15bf-4ca0-b015-7273e44de29a\" (UID: \"b46fceca-15bf-4ca0-b015-7273e44de29a\") "
Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.694668 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46fceca-15bf-4ca0-b015-7273e44de29a-kube-api-access-rvrln" (OuterVolumeSpecName: "kube-api-access-rvrln") pod "b46fceca-15bf-4ca0-b015-7273e44de29a" (UID: "b46fceca-15bf-4ca0-b015-7273e44de29a"). InnerVolumeSpecName "kube-api-access-rvrln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.700801 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-utilities" (OuterVolumeSpecName: "utilities") pod "b46fceca-15bf-4ca0-b015-7273e44de29a" (UID: "b46fceca-15bf-4ca0-b015-7273e44de29a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.789389 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvrln\" (UniqueName: \"kubernetes.io/projected/b46fceca-15bf-4ca0-b015-7273e44de29a-kube-api-access-rvrln\") on node \"crc\" DevicePath \"\""
Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.789424 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.829660 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b46fceca-15bf-4ca0-b015-7273e44de29a" (UID: "b46fceca-15bf-4ca0-b015-7273e44de29a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 04:00:07 crc kubenswrapper[4923]: I0224 04:00:07.891191 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b46fceca-15bf-4ca0-b015-7273e44de29a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.095710 4923 generic.go:334] "Generic (PLEG): container finished" podID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerID="8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b" exitCode=0
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.095750 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw7w" event={"ID":"b46fceca-15bf-4ca0-b015-7273e44de29a","Type":"ContainerDied","Data":"8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b"}
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.095780 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lbw7w" event={"ID":"b46fceca-15bf-4ca0-b015-7273e44de29a","Type":"ContainerDied","Data":"0c2bc7daaf95c896a25da1cbf1e4639a478408c4ed85e189e90e8873c897c3a0"}
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.095783 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lbw7w"
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.095796 4923 scope.go:117] "RemoveContainer" containerID="8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b"
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.120269 4923 scope.go:117] "RemoveContainer" containerID="4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a"
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.135151 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lbw7w"]
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.142522 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lbw7w"]
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.148219 4923 scope.go:117] "RemoveContainer" containerID="86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347"
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.184904 4923 scope.go:117] "RemoveContainer" containerID="8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b"
Feb 24 04:00:08 crc kubenswrapper[4923]: E0224 04:00:08.185702 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b\": container with ID starting with 8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b not found: ID does not exist" containerID="8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b"
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.185748 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b"} err="failed to get container status \"8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b\": rpc error: code = NotFound desc = could not find container \"8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b\": container with ID starting with 8e98c43495be7af3ab17ccb797cf9b7f03295d99ae2a28eb4b18810bbf30b04b not found: ID does not exist"
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.185773 4923 scope.go:117] "RemoveContainer" containerID="4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a"
Feb 24 04:00:08 crc kubenswrapper[4923]: E0224 04:00:08.186244 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a\": container with ID starting with 4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a not found: ID does not exist" containerID="4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a"
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.186276 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a"} err="failed to get container status \"4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a\": rpc error: code = NotFound desc = could not find container \"4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a\": container with ID starting with 4461a39d8875f1add0cd2c1363f5137173950d818e85d4145be0fbc16d65198a not found: ID does not exist"
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.186320 4923 scope.go:117] "RemoveContainer" containerID="86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347"
Feb 24 04:00:08 crc kubenswrapper[4923]: E0224 04:00:08.186691 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347\": container with ID starting with 86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347 not found: ID does not exist" containerID="86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347"
Feb 24 04:00:08 crc kubenswrapper[4923]: I0224 04:00:08.186735 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347"} err="failed to get container status \"86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347\": rpc error: code = NotFound desc = could not find container \"86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347\": container with ID starting with 86e0a6e4ff4f78e4c5591b30a831444064ff5985509041a5b2441a2679a5d347 not found: ID does not exist"
Feb 24 04:00:09 crc kubenswrapper[4923]: I0224 04:00:09.724024 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" path="/var/lib/kubelet/pods/b46fceca-15bf-4ca0-b015-7273e44de29a/volumes"
Feb 24 04:00:19 crc kubenswrapper[4923]: I0224 04:00:19.916136 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 04:00:19 crc kubenswrapper[4923]: I0224 04:00:19.916697 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 04:00:19 crc kubenswrapper[4923]: I0224 04:00:19.916732 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t"
Feb 24 04:00:19 crc kubenswrapper[4923]: I0224 04:00:19.917475 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 04:00:19 crc kubenswrapper[4923]: I0224 04:00:19.917521 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" gracePeriod=600
Feb 24 04:00:20 crc kubenswrapper[4923]: E0224 04:00:20.046354 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec"
Feb 24 04:00:20 crc kubenswrapper[4923]: I0224 04:00:20.208293 4923 generic.go:334] "Generic (PLEG): container finished" podID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" exitCode=0
Feb 24 04:00:20 crc kubenswrapper[4923]: I0224 04:00:20.208329 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerDied","Data":"4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3"}
Feb 24 04:00:20 crc kubenswrapper[4923]: I0224 04:00:20.208390 4923 scope.go:117] "RemoveContainer" containerID="9fe9e45e3a82c3754daa7fe6ed42ef47bf61b7c7019ca6a9f39fabf754ac5291"
Feb 24 04:00:20 crc kubenswrapper[4923]: I0224 04:00:20.208983 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3"
Feb 24 04:00:20 crc kubenswrapper[4923]: E0224 04:00:20.209225 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.131594 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-smr6t"]
Feb 24 04:00:24 crc kubenswrapper[4923]: E0224 04:00:24.132642 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerName="registry-server"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.132656 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerName="registry-server"
Feb 24 04:00:24 crc kubenswrapper[4923]: E0224 04:00:24.132678 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerName="extract-utilities"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.132686 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerName="extract-utilities"
Feb 24 04:00:24 crc kubenswrapper[4923]: E0224 04:00:24.132702 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e616521b-d0a5-44f5-b4a1-66793042fdc8" containerName="collect-profiles"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.132709 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="e616521b-d0a5-44f5-b4a1-66793042fdc8" containerName="collect-profiles"
Feb 24 04:00:24 crc kubenswrapper[4923]: E0224 04:00:24.132735 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerName="extract-content"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.132752 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerName="extract-content"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.132965 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="e616521b-d0a5-44f5-b4a1-66793042fdc8" containerName="collect-profiles"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.132979 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46fceca-15bf-4ca0-b015-7273e44de29a" containerName="registry-server"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.134537 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.143778 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-smr6t"]
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.240469 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-utilities\") pod \"certified-operators-smr6t\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") " pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.240511 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-catalog-content\") pod \"certified-operators-smr6t\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") " pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.240861 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhm5\" (UniqueName: \"kubernetes.io/projected/429930fb-969e-4d0d-bcae-7eb35eb299ea-kube-api-access-9hhm5\") pod \"certified-operators-smr6t\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") " pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.342520 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hhm5\" (UniqueName: \"kubernetes.io/projected/429930fb-969e-4d0d-bcae-7eb35eb299ea-kube-api-access-9hhm5\") pod \"certified-operators-smr6t\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") " pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.342621 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-utilities\") pod \"certified-operators-smr6t\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") " pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.342640 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-catalog-content\") pod \"certified-operators-smr6t\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") " pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.343100 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-utilities\") pod \"certified-operators-smr6t\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") " pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.343131 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-catalog-content\") pod \"certified-operators-smr6t\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") " pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.359850 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhm5\" (UniqueName: \"kubernetes.io/projected/429930fb-969e-4d0d-bcae-7eb35eb299ea-kube-api-access-9hhm5\") pod \"certified-operators-smr6t\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") " pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:24 crc kubenswrapper[4923]: I0224 04:00:24.459484 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:25 crc kubenswrapper[4923]: I0224 04:00:25.056087 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-smr6t"]
Feb 24 04:00:25 crc kubenswrapper[4923]: I0224 04:00:25.273810 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smr6t" event={"ID":"429930fb-969e-4d0d-bcae-7eb35eb299ea","Type":"ContainerStarted","Data":"5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12"}
Feb 24 04:00:25 crc kubenswrapper[4923]: I0224 04:00:25.275030 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smr6t" event={"ID":"429930fb-969e-4d0d-bcae-7eb35eb299ea","Type":"ContainerStarted","Data":"6e301ff486a924b43044d2f175aa321c0247fe5f69611fab2f4daf5f4e56ee8a"}
Feb 24 04:00:26 crc kubenswrapper[4923]: I0224 04:00:26.288717 4923 generic.go:334] "Generic (PLEG): container finished" podID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerID="5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12" exitCode=0
Feb 24 04:00:26 crc kubenswrapper[4923]: I0224 04:00:26.288758 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smr6t" event={"ID":"429930fb-969e-4d0d-bcae-7eb35eb299ea","Type":"ContainerDied","Data":"5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12"}
Feb 24 04:00:27 crc kubenswrapper[4923]: I0224 04:00:27.298068 4923 generic.go:334] "Generic (PLEG): container finished" podID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerID="87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe" exitCode=0
Feb 24 04:00:27 crc kubenswrapper[4923]: I0224 04:00:27.298128 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smr6t" event={"ID":"429930fb-969e-4d0d-bcae-7eb35eb299ea","Type":"ContainerDied","Data":"87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe"}
Feb 24 04:00:28 crc kubenswrapper[4923]: I0224 04:00:28.309642 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smr6t" event={"ID":"429930fb-969e-4d0d-bcae-7eb35eb299ea","Type":"ContainerStarted","Data":"9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7"}
Feb 24 04:00:28 crc kubenswrapper[4923]: I0224 04:00:28.338999 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-smr6t" podStartSLOduration=1.883249658 podStartE2EDuration="4.338981662s" podCreationTimestamp="2026-02-24 04:00:24 +0000 UTC" firstStartedPulling="2026-02-24 04:00:25.277137242 +0000 UTC m=+3949.294208045" lastFinishedPulling="2026-02-24 04:00:27.732869236 +0000 UTC m=+3951.749940049" observedRunningTime="2026-02-24 04:00:28.329529605 +0000 UTC m=+3952.346600418" watchObservedRunningTime="2026-02-24 04:00:28.338981662 +0000 UTC m=+3952.356052475"
Feb 24 04:00:34 crc kubenswrapper[4923]: I0224 04:00:34.460760 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:34 crc kubenswrapper[4923]: I0224 04:00:34.461255 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:34 crc kubenswrapper[4923]: I0224 04:00:34.514859 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:35 crc kubenswrapper[4923]: I0224 04:00:35.425043 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:35 crc kubenswrapper[4923]: I0224 04:00:35.473948 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-smr6t"]
Feb 24 04:00:35 crc kubenswrapper[4923]: I0224 04:00:35.713699 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3"
Feb 24 04:00:35 crc kubenswrapper[4923]: E0224 04:00:35.714289 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec"
Feb 24 04:00:37 crc kubenswrapper[4923]: I0224 04:00:37.385352 4923 generic.go:334] "Generic (PLEG): container finished" podID="37037463-da34-4501-8f0d-2190ad5ae59f" containerID="55371fa78305f9a6466e15d46a9b35dc6a5986eb2eeb46ad4a9294c2f2a67914" exitCode=0
Feb 24 04:00:37 crc kubenswrapper[4923]: I0224 04:00:37.385465 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/crc-debug-clb8f" event={"ID":"37037463-da34-4501-8f0d-2190ad5ae59f","Type":"ContainerDied","Data":"55371fa78305f9a6466e15d46a9b35dc6a5986eb2eeb46ad4a9294c2f2a67914"}
Feb 24 04:00:37 crc kubenswrapper[4923]: I0224 04:00:37.385940 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-smr6t" podUID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerName="registry-server" containerID="cri-o://9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7" gracePeriod=2
Feb 24 04:00:37 crc kubenswrapper[4923]: I0224 04:00:37.948526 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.007585 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hhm5\" (UniqueName: \"kubernetes.io/projected/429930fb-969e-4d0d-bcae-7eb35eb299ea-kube-api-access-9hhm5\") pod \"429930fb-969e-4d0d-bcae-7eb35eb299ea\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") "
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.007648 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-catalog-content\") pod \"429930fb-969e-4d0d-bcae-7eb35eb299ea\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") "
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.007687 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-utilities\") pod \"429930fb-969e-4d0d-bcae-7eb35eb299ea\" (UID: \"429930fb-969e-4d0d-bcae-7eb35eb299ea\") "
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.008798 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-utilities" (OuterVolumeSpecName: "utilities") pod "429930fb-969e-4d0d-bcae-7eb35eb299ea" (UID: "429930fb-969e-4d0d-bcae-7eb35eb299ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.026455 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429930fb-969e-4d0d-bcae-7eb35eb299ea-kube-api-access-9hhm5" (OuterVolumeSpecName: "kube-api-access-9hhm5") pod "429930fb-969e-4d0d-bcae-7eb35eb299ea" (UID: "429930fb-969e-4d0d-bcae-7eb35eb299ea"). InnerVolumeSpecName "kube-api-access-9hhm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.072023 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "429930fb-969e-4d0d-bcae-7eb35eb299ea" (UID: "429930fb-969e-4d0d-bcae-7eb35eb299ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.110058 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.110085 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429930fb-969e-4d0d-bcae-7eb35eb299ea-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.110095 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hhm5\" (UniqueName: \"kubernetes.io/projected/429930fb-969e-4d0d-bcae-7eb35eb299ea-kube-api-access-9hhm5\") on node \"crc\" DevicePath \"\""
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.395408 4923 generic.go:334] "Generic (PLEG): container finished" podID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerID="9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7" exitCode=0
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.395620 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smr6t" event={"ID":"429930fb-969e-4d0d-bcae-7eb35eb299ea","Type":"ContainerDied","Data":"9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7"}
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.395683 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-smr6t"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.395755 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-smr6t" event={"ID":"429930fb-969e-4d0d-bcae-7eb35eb299ea","Type":"ContainerDied","Data":"6e301ff486a924b43044d2f175aa321c0247fe5f69611fab2f4daf5f4e56ee8a"}
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.395793 4923 scope.go:117] "RemoveContainer" containerID="9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.470217 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-clb8f"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.470866 4923 scope.go:117] "RemoveContainer" containerID="87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.502139 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-smr6t"]
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.504462 4923 scope.go:117] "RemoveContainer" containerID="5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.525496 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-smr6t"]
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.583630 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-przsk/crc-debug-clb8f"]
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.606464 4923 scope.go:117] "RemoveContainer" containerID="9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.612931 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-przsk/crc-debug-clb8f"]
Feb 24 04:00:38 crc kubenswrapper[4923]: E0224 04:00:38.616444 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7\": container with ID starting with 9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7 not found: ID does not exist" containerID="9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.616493 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7"} err="failed to get container status \"9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7\": rpc error: code = NotFound desc = could not find container \"9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7\": container with ID starting with 9946520a47155ae2165b205d196f1c9c826169f5afd2a73ab190c57655d48bc7 not found: ID does not exist"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.616519 4923 scope.go:117] "RemoveContainer" containerID="87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.622048 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37037463-da34-4501-8f0d-2190ad5ae59f-host\") pod \"37037463-da34-4501-8f0d-2190ad5ae59f\" (UID: \"37037463-da34-4501-8f0d-2190ad5ae59f\") "
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.622192 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7nng\" (UniqueName: \"kubernetes.io/projected/37037463-da34-4501-8f0d-2190ad5ae59f-kube-api-access-f7nng\") pod \"37037463-da34-4501-8f0d-2190ad5ae59f\" (UID: \"37037463-da34-4501-8f0d-2190ad5ae59f\") "
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.625430 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37037463-da34-4501-8f0d-2190ad5ae59f-host" (OuterVolumeSpecName: "host") pod "37037463-da34-4501-8f0d-2190ad5ae59f" (UID: "37037463-da34-4501-8f0d-2190ad5ae59f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 04:00:38 crc kubenswrapper[4923]: E0224 04:00:38.625435 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe\": container with ID starting with 87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe not found: ID does not exist" containerID="87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.625505 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe"} err="failed to get container status \"87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe\": rpc error: code = NotFound desc = could not find container \"87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe\": container with ID starting with 87c5b842a208436a6859180ed51c2d2931a3de536369e7cd18e13885c0a661fe not found: ID does not exist"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.625538 4923 scope.go:117] "RemoveContainer" containerID="5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.634523 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37037463-da34-4501-8f0d-2190ad5ae59f-kube-api-access-f7nng" (OuterVolumeSpecName: "kube-api-access-f7nng") pod "37037463-da34-4501-8f0d-2190ad5ae59f" (UID: "37037463-da34-4501-8f0d-2190ad5ae59f"). InnerVolumeSpecName "kube-api-access-f7nng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 04:00:38 crc kubenswrapper[4923]: E0224 04:00:38.641608 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12\": container with ID starting with 5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12 not found: ID does not exist" containerID="5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.641650 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12"} err="failed to get container status \"5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12\": rpc error: code = NotFound desc = could not find container \"5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12\": container with ID starting with 5351c5e4efab58655c45b4676976fc2dba72cdd8ce70b16ed40e2fe617d99d12 not found: ID does not exist"
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.724049 4923 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37037463-da34-4501-8f0d-2190ad5ae59f-host\") on node \"crc\" DevicePath \"\""
Feb 24 04:00:38 crc kubenswrapper[4923]: I0224 04:00:38.724484 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7nng\" (UniqueName: \"kubernetes.io/projected/37037463-da34-4501-8f0d-2190ad5ae59f-kube-api-access-f7nng\") on node \"crc\" DevicePath \"\""
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.404623 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="879be6631b2a6c12745ac05722508ba07dd5cba052464abb0b8636c1f3fd5c9c"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.404629 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-clb8f"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.723648 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37037463-da34-4501-8f0d-2190ad5ae59f" path="/var/lib/kubelet/pods/37037463-da34-4501-8f0d-2190ad5ae59f/volumes"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.724656 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429930fb-969e-4d0d-bcae-7eb35eb299ea" path="/var/lib/kubelet/pods/429930fb-969e-4d0d-bcae-7eb35eb299ea/volumes"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.957581 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-przsk/crc-debug-9klf4"]
Feb 24 04:00:39 crc kubenswrapper[4923]: E0224 04:00:39.958219 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerName="extract-utilities"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.958236 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerName="extract-utilities"
Feb 24 04:00:39 crc kubenswrapper[4923]: E0224 04:00:39.958269 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerName="extract-content"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.958277 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerName="extract-content"
Feb 24 04:00:39 crc kubenswrapper[4923]: E0224 04:00:39.958290 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37037463-da34-4501-8f0d-2190ad5ae59f" containerName="container-00"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.958310 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="37037463-da34-4501-8f0d-2190ad5ae59f" containerName="container-00"
Feb 24 04:00:39 crc kubenswrapper[4923]: E0224 04:00:39.958321 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerName="registry-server"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.958327 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerName="registry-server"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.958495 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="37037463-da34-4501-8f0d-2190ad5ae59f" containerName="container-00"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.958518 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="429930fb-969e-4d0d-bcae-7eb35eb299ea" containerName="registry-server"
Feb 24 04:00:39 crc kubenswrapper[4923]: I0224 04:00:39.959074 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-9klf4"
Feb 24 04:00:40 crc kubenswrapper[4923]: I0224 04:00:40.146738 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqtk\" (UniqueName: \"kubernetes.io/projected/d11c7c5f-b7fa-4db1-a329-86d14afe3076-kube-api-access-mjqtk\") pod \"crc-debug-9klf4\" (UID: \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\") " pod="openshift-must-gather-przsk/crc-debug-9klf4"
Feb 24 04:00:40 crc kubenswrapper[4923]: I0224 04:00:40.146830 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c7c5f-b7fa-4db1-a329-86d14afe3076-host\") pod \"crc-debug-9klf4\" (UID: \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\") " pod="openshift-must-gather-przsk/crc-debug-9klf4"
Feb 24 04:00:40 crc kubenswrapper[4923]: I0224 04:00:40.248400 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqtk\" (UniqueName:
\"kubernetes.io/projected/d11c7c5f-b7fa-4db1-a329-86d14afe3076-kube-api-access-mjqtk\") pod \"crc-debug-9klf4\" (UID: \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\") " pod="openshift-must-gather-przsk/crc-debug-9klf4" Feb 24 04:00:40 crc kubenswrapper[4923]: I0224 04:00:40.248473 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c7c5f-b7fa-4db1-a329-86d14afe3076-host\") pod \"crc-debug-9klf4\" (UID: \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\") " pod="openshift-must-gather-przsk/crc-debug-9klf4" Feb 24 04:00:40 crc kubenswrapper[4923]: I0224 04:00:40.248665 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c7c5f-b7fa-4db1-a329-86d14afe3076-host\") pod \"crc-debug-9klf4\" (UID: \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\") " pod="openshift-must-gather-przsk/crc-debug-9klf4" Feb 24 04:00:40 crc kubenswrapper[4923]: I0224 04:00:40.282963 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqtk\" (UniqueName: \"kubernetes.io/projected/d11c7c5f-b7fa-4db1-a329-86d14afe3076-kube-api-access-mjqtk\") pod \"crc-debug-9klf4\" (UID: \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\") " pod="openshift-must-gather-przsk/crc-debug-9klf4" Feb 24 04:00:40 crc kubenswrapper[4923]: I0224 04:00:40.575559 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-9klf4" Feb 24 04:00:40 crc kubenswrapper[4923]: W0224 04:00:40.613966 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd11c7c5f_b7fa_4db1_a329_86d14afe3076.slice/crio-cbd694699d1102806a6af6db5314c0d80f50b23738bc8ca612ad03e9181fdeb8 WatchSource:0}: Error finding container cbd694699d1102806a6af6db5314c0d80f50b23738bc8ca612ad03e9181fdeb8: Status 404 returned error can't find the container with id cbd694699d1102806a6af6db5314c0d80f50b23738bc8ca612ad03e9181fdeb8 Feb 24 04:00:41 crc kubenswrapper[4923]: I0224 04:00:41.424050 4923 generic.go:334] "Generic (PLEG): container finished" podID="d11c7c5f-b7fa-4db1-a329-86d14afe3076" containerID="cdfad27bec8058c5e8549f7d99bb55ef1447577b00c265c12364146d679ead3f" exitCode=0 Feb 24 04:00:41 crc kubenswrapper[4923]: I0224 04:00:41.424126 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/crc-debug-9klf4" event={"ID":"d11c7c5f-b7fa-4db1-a329-86d14afe3076","Type":"ContainerDied","Data":"cdfad27bec8058c5e8549f7d99bb55ef1447577b00c265c12364146d679ead3f"} Feb 24 04:00:41 crc kubenswrapper[4923]: I0224 04:00:41.424426 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/crc-debug-9klf4" event={"ID":"d11c7c5f-b7fa-4db1-a329-86d14afe3076","Type":"ContainerStarted","Data":"cbd694699d1102806a6af6db5314c0d80f50b23738bc8ca612ad03e9181fdeb8"} Feb 24 04:00:41 crc kubenswrapper[4923]: I0224 04:00:41.863309 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-przsk/crc-debug-9klf4"] Feb 24 04:00:41 crc kubenswrapper[4923]: I0224 04:00:41.879171 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-przsk/crc-debug-9klf4"] Feb 24 04:00:42 crc kubenswrapper[4923]: I0224 04:00:42.533694 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-9klf4" Feb 24 04:00:42 crc kubenswrapper[4923]: I0224 04:00:42.686519 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c7c5f-b7fa-4db1-a329-86d14afe3076-host\") pod \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\" (UID: \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\") " Feb 24 04:00:42 crc kubenswrapper[4923]: I0224 04:00:42.686634 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjqtk\" (UniqueName: \"kubernetes.io/projected/d11c7c5f-b7fa-4db1-a329-86d14afe3076-kube-api-access-mjqtk\") pod \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\" (UID: \"d11c7c5f-b7fa-4db1-a329-86d14afe3076\") " Feb 24 04:00:42 crc kubenswrapper[4923]: I0224 04:00:42.686669 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d11c7c5f-b7fa-4db1-a329-86d14afe3076-host" (OuterVolumeSpecName: "host") pod "d11c7c5f-b7fa-4db1-a329-86d14afe3076" (UID: "d11c7c5f-b7fa-4db1-a329-86d14afe3076"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 04:00:42 crc kubenswrapper[4923]: I0224 04:00:42.691877 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c7c5f-b7fa-4db1-a329-86d14afe3076-kube-api-access-mjqtk" (OuterVolumeSpecName: "kube-api-access-mjqtk") pod "d11c7c5f-b7fa-4db1-a329-86d14afe3076" (UID: "d11c7c5f-b7fa-4db1-a329-86d14afe3076"). InnerVolumeSpecName "kube-api-access-mjqtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 04:00:42 crc kubenswrapper[4923]: I0224 04:00:42.788663 4923 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c7c5f-b7fa-4db1-a329-86d14afe3076-host\") on node \"crc\" DevicePath \"\"" Feb 24 04:00:42 crc kubenswrapper[4923]: I0224 04:00:42.788707 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjqtk\" (UniqueName: \"kubernetes.io/projected/d11c7c5f-b7fa-4db1-a329-86d14afe3076-kube-api-access-mjqtk\") on node \"crc\" DevicePath \"\"" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.048729 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-przsk/crc-debug-vdzzs"] Feb 24 04:00:43 crc kubenswrapper[4923]: E0224 04:00:43.051156 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11c7c5f-b7fa-4db1-a329-86d14afe3076" containerName="container-00" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.051259 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11c7c5f-b7fa-4db1-a329-86d14afe3076" containerName="container-00" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.051751 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11c7c5f-b7fa-4db1-a329-86d14afe3076" containerName="container-00" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.052945 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.194848 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-host\") pod \"crc-debug-vdzzs\" (UID: \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\") " pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.195277 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8bnp\" (UniqueName: \"kubernetes.io/projected/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-kube-api-access-c8bnp\") pod \"crc-debug-vdzzs\" (UID: \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\") " pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.297020 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-host\") pod \"crc-debug-vdzzs\" (UID: \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\") " pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.297164 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8bnp\" (UniqueName: \"kubernetes.io/projected/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-kube-api-access-c8bnp\") pod \"crc-debug-vdzzs\" (UID: \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\") " pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.297188 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-host\") pod \"crc-debug-vdzzs\" (UID: \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\") " pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:43 crc 
kubenswrapper[4923]: I0224 04:00:43.321019 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8bnp\" (UniqueName: \"kubernetes.io/projected/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-kube-api-access-c8bnp\") pod \"crc-debug-vdzzs\" (UID: \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\") " pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.380254 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.441265 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-9klf4" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.441347 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbd694699d1102806a6af6db5314c0d80f50b23738bc8ca612ad03e9181fdeb8" Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.442371 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/crc-debug-vdzzs" event={"ID":"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45","Type":"ContainerStarted","Data":"77764f8490dee4c322618f2458475eb64e688fe90c3ba3f9d83236e7d1c30111"} Feb 24 04:00:43 crc kubenswrapper[4923]: I0224 04:00:43.724288 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11c7c5f-b7fa-4db1-a329-86d14afe3076" path="/var/lib/kubelet/pods/d11c7c5f-b7fa-4db1-a329-86d14afe3076/volumes" Feb 24 04:00:44 crc kubenswrapper[4923]: I0224 04:00:44.451127 4923 generic.go:334] "Generic (PLEG): container finished" podID="8f6f1a68-c1cd-499c-aa90-72b97cbd9c45" containerID="19c73c039f1e5efce79de8aaf0f77dd3f68c2b1d68e700dc6b149739547f62ff" exitCode=0 Feb 24 04:00:44 crc kubenswrapper[4923]: I0224 04:00:44.451168 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/crc-debug-vdzzs" 
event={"ID":"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45","Type":"ContainerDied","Data":"19c73c039f1e5efce79de8aaf0f77dd3f68c2b1d68e700dc6b149739547f62ff"} Feb 24 04:00:44 crc kubenswrapper[4923]: I0224 04:00:44.487820 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-przsk/crc-debug-vdzzs"] Feb 24 04:00:44 crc kubenswrapper[4923]: I0224 04:00:44.495331 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-przsk/crc-debug-vdzzs"] Feb 24 04:00:45 crc kubenswrapper[4923]: I0224 04:00:45.546567 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:45 crc kubenswrapper[4923]: I0224 04:00:45.638039 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8bnp\" (UniqueName: \"kubernetes.io/projected/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-kube-api-access-c8bnp\") pod \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\" (UID: \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\") " Feb 24 04:00:45 crc kubenswrapper[4923]: I0224 04:00:45.638201 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-host\") pod \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\" (UID: \"8f6f1a68-c1cd-499c-aa90-72b97cbd9c45\") " Feb 24 04:00:45 crc kubenswrapper[4923]: I0224 04:00:45.638323 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-host" (OuterVolumeSpecName: "host") pod "8f6f1a68-c1cd-499c-aa90-72b97cbd9c45" (UID: "8f6f1a68-c1cd-499c-aa90-72b97cbd9c45"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 04:00:45 crc kubenswrapper[4923]: I0224 04:00:45.638688 4923 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-host\") on node \"crc\" DevicePath \"\"" Feb 24 04:00:45 crc kubenswrapper[4923]: I0224 04:00:45.644214 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-kube-api-access-c8bnp" (OuterVolumeSpecName: "kube-api-access-c8bnp") pod "8f6f1a68-c1cd-499c-aa90-72b97cbd9c45" (UID: "8f6f1a68-c1cd-499c-aa90-72b97cbd9c45"). InnerVolumeSpecName "kube-api-access-c8bnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 04:00:45 crc kubenswrapper[4923]: I0224 04:00:45.727443 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f6f1a68-c1cd-499c-aa90-72b97cbd9c45" path="/var/lib/kubelet/pods/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45/volumes" Feb 24 04:00:45 crc kubenswrapper[4923]: I0224 04:00:45.739928 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8bnp\" (UniqueName: \"kubernetes.io/projected/8f6f1a68-c1cd-499c-aa90-72b97cbd9c45-kube-api-access-c8bnp\") on node \"crc\" DevicePath \"\"" Feb 24 04:00:46 crc kubenswrapper[4923]: I0224 04:00:46.468047 4923 scope.go:117] "RemoveContainer" containerID="19c73c039f1e5efce79de8aaf0f77dd3f68c2b1d68e700dc6b149739547f62ff" Feb 24 04:00:46 crc kubenswrapper[4923]: I0224 04:00:46.468631 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-przsk/crc-debug-vdzzs" Feb 24 04:00:46 crc kubenswrapper[4923]: I0224 04:00:46.713160 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:00:46 crc kubenswrapper[4923]: E0224 04:00:46.713758 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:00:59 crc kubenswrapper[4923]: I0224 04:00:59.261994 4923 scope.go:117] "RemoveContainer" containerID="0adaafb21f3da3758ef9730ab67e02aafcc6f9b5dae51df2c4c4fa12c15bf219" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.150065 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29531761-xszfj"] Feb 24 04:01:00 crc kubenswrapper[4923]: E0224 04:01:00.150772 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6f1a68-c1cd-499c-aa90-72b97cbd9c45" containerName="container-00" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.150788 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6f1a68-c1cd-499c-aa90-72b97cbd9c45" containerName="container-00" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.150965 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6f1a68-c1cd-499c-aa90-72b97cbd9c45" containerName="container-00" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.151613 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.160741 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29531761-xszfj"] Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.335687 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-combined-ca-bundle\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.335733 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-fernet-keys\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.336047 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv82w\" (UniqueName: \"kubernetes.io/projected/993f74fa-4266-4bab-a161-f6ffffe458a5-kube-api-access-kv82w\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.336103 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-config-data\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.437898 4923 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-combined-ca-bundle\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.437951 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-fernet-keys\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.438040 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-config-data\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.438059 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv82w\" (UniqueName: \"kubernetes.io/projected/993f74fa-4266-4bab-a161-f6ffffe458a5-kube-api-access-kv82w\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.446199 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-config-data\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.448961 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-fernet-keys\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.449292 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-combined-ca-bundle\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.456701 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv82w\" (UniqueName: \"kubernetes.io/projected/993f74fa-4266-4bab-a161-f6ffffe458a5-kube-api-access-kv82w\") pod \"keystone-cron-29531761-xszfj\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:00 crc kubenswrapper[4923]: I0224 04:01:00.474277 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:01 crc kubenswrapper[4923]: I0224 04:01:00.940733 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29531761-xszfj"] Feb 24 04:01:01 crc kubenswrapper[4923]: I0224 04:01:01.625163 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29531761-xszfj" event={"ID":"993f74fa-4266-4bab-a161-f6ffffe458a5","Type":"ContainerStarted","Data":"2483a890ad03f7b96d8118abd0508444a3f0732da1dd085621aa3d6015e14ab2"} Feb 24 04:01:01 crc kubenswrapper[4923]: I0224 04:01:01.625534 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29531761-xszfj" event={"ID":"993f74fa-4266-4bab-a161-f6ffffe458a5","Type":"ContainerStarted","Data":"d319c850d00bae87a9edb9256048fa5f52df23b5c507c72fd75b98dc7ecdcd4d"} Feb 24 04:01:01 crc kubenswrapper[4923]: I0224 04:01:01.644166 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29531761-xszfj" podStartSLOduration=1.644147802 podStartE2EDuration="1.644147802s" podCreationTimestamp="2026-02-24 04:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 04:01:01.638762232 +0000 UTC m=+3985.655833065" watchObservedRunningTime="2026-02-24 04:01:01.644147802 +0000 UTC m=+3985.661218605" Feb 24 04:01:01 crc kubenswrapper[4923]: I0224 04:01:01.715812 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:01:01 crc kubenswrapper[4923]: E0224 04:01:01.716062 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:01:03 crc kubenswrapper[4923]: I0224 04:01:03.650052 4923 generic.go:334] "Generic (PLEG): container finished" podID="993f74fa-4266-4bab-a161-f6ffffe458a5" containerID="2483a890ad03f7b96d8118abd0508444a3f0732da1dd085621aa3d6015e14ab2" exitCode=0 Feb 24 04:01:03 crc kubenswrapper[4923]: I0224 04:01:03.650130 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29531761-xszfj" event={"ID":"993f74fa-4266-4bab-a161-f6ffffe458a5","Type":"ContainerDied","Data":"2483a890ad03f7b96d8118abd0508444a3f0732da1dd085621aa3d6015e14ab2"} Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.029854 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.122689 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-fernet-keys\") pod \"993f74fa-4266-4bab-a161-f6ffffe458a5\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.122742 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv82w\" (UniqueName: \"kubernetes.io/projected/993f74fa-4266-4bab-a161-f6ffffe458a5-kube-api-access-kv82w\") pod \"993f74fa-4266-4bab-a161-f6ffffe458a5\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.122900 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-config-data\") pod \"993f74fa-4266-4bab-a161-f6ffffe458a5\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.122951 4923 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-combined-ca-bundle\") pod \"993f74fa-4266-4bab-a161-f6ffffe458a5\" (UID: \"993f74fa-4266-4bab-a161-f6ffffe458a5\") " Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.129236 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993f74fa-4266-4bab-a161-f6ffffe458a5-kube-api-access-kv82w" (OuterVolumeSpecName: "kube-api-access-kv82w") pod "993f74fa-4266-4bab-a161-f6ffffe458a5" (UID: "993f74fa-4266-4bab-a161-f6ffffe458a5"). InnerVolumeSpecName "kube-api-access-kv82w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.132440 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "993f74fa-4266-4bab-a161-f6ffffe458a5" (UID: "993f74fa-4266-4bab-a161-f6ffffe458a5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.156578 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "993f74fa-4266-4bab-a161-f6ffffe458a5" (UID: "993f74fa-4266-4bab-a161-f6ffffe458a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.180547 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-config-data" (OuterVolumeSpecName: "config-data") pod "993f74fa-4266-4bab-a161-f6ffffe458a5" (UID: "993f74fa-4266-4bab-a161-f6ffffe458a5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.225214 4923 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-config-data\") on node \"crc\" DevicePath \"\"" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.225253 4923 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.225267 4923 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/993f74fa-4266-4bab-a161-f6ffffe458a5-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.225278 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv82w\" (UniqueName: \"kubernetes.io/projected/993f74fa-4266-4bab-a161-f6ffffe458a5-kube-api-access-kv82w\") on node \"crc\" DevicePath \"\"" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.667271 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29531761-xszfj" event={"ID":"993f74fa-4266-4bab-a161-f6ffffe458a5","Type":"ContainerDied","Data":"d319c850d00bae87a9edb9256048fa5f52df23b5c507c72fd75b98dc7ecdcd4d"} Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.667639 4923 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d319c850d00bae87a9edb9256048fa5f52df23b5c507c72fd75b98dc7ecdcd4d" Feb 24 04:01:05 crc kubenswrapper[4923]: I0224 04:01:05.667367 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29531761-xszfj" Feb 24 04:01:13 crc kubenswrapper[4923]: I0224 04:01:13.714529 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:01:13 crc kubenswrapper[4923]: E0224 04:01:13.715413 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:01:14 crc kubenswrapper[4923]: I0224 04:01:14.262721 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d7c54dbbb-xcg2j_6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed/barbican-api/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.066682 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7d7c54dbbb-xcg2j_6ddc5ce5-ccc7-4db9-82e7-d42f8ea0f7ed/barbican-api-log/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.124136 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-895b8674b-v44h4_774cca46-21ee-41c1-81e7-00c89c26ad37/barbican-keystone-listener-log/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.153649 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-895b8674b-v44h4_774cca46-21ee-41c1-81e7-00c89c26ad37/barbican-keystone-listener/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.298973 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c8bfc6649-mz55h_4cafcd89-7a31-47f2-980b-9b9a6a21bd49/barbican-worker/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.351441 4923 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c8bfc6649-mz55h_4cafcd89-7a31-47f2-980b-9b9a6a21bd49/barbican-worker-log/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.475761 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7nghb_dd6fe20f-e2e1-46ae-aa88-cbfd410076a2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.543403 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0af0866b-f6b6-45cb-9322-25fc22f6b6b4/ceilometer-central-agent/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.617483 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0af0866b-f6b6-45cb-9322-25fc22f6b6b4/ceilometer-notification-agent/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.652284 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0af0866b-f6b6-45cb-9322-25fc22f6b6b4/proxy-httpd/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.735065 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0af0866b-f6b6-45cb-9322-25fc22f6b6b4/sg-core/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.830914 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e3799b0-c8b2-4204-8b12-62e28dee2c09/cinder-api-log/0.log" Feb 24 04:01:15 crc kubenswrapper[4923]: I0224 04:01:15.851459 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e3799b0-c8b2-4204-8b12-62e28dee2c09/cinder-api/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.005902 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf/cinder-scheduler/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.049069 4923 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6ba44e97-ed2b-4e52-8f38-3279a2fdb3bf/probe/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.167340 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-r6rg2_8dbbe8ec-f9b0-4dfe-a1ae-63ff9e7f1355/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.225015 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-smdtc_ec71f0e3-4ff0-46b6-a887-37132374b80c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.372057 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-4v9fw_7ad3dfbc-174b-4b0c-9d41-a0c51eead210/init/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.565500 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-4v9fw_7ad3dfbc-174b-4b0c-9d41-a0c51eead210/init/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.640669 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vlfgw_04b83327-1210-4a1a-b104-70fff61786bf/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.693671 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-4v9fw_7ad3dfbc-174b-4b0c-9d41-a0c51eead210/dnsmasq-dns/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.804991 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3b54c615-8156-4ec6-aee7-b8c9448a574e/glance-httpd/0.log" Feb 24 04:01:16 crc kubenswrapper[4923]: I0224 04:01:16.844529 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_3b54c615-8156-4ec6-aee7-b8c9448a574e/glance-log/0.log" Feb 24 04:01:17 crc kubenswrapper[4923]: I0224 04:01:17.003099 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2656e2e0-085f-443d-ad1c-2243a4f92a11/glance-httpd/0.log" Feb 24 04:01:17 crc kubenswrapper[4923]: I0224 04:01:17.032333 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2656e2e0-085f-443d-ad1c-2243a4f92a11/glance-log/0.log" Feb 24 04:01:17 crc kubenswrapper[4923]: I0224 04:01:17.273803 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dcbd8cd94-497ns_3cad919b-bb41-4c17-a13a-01831e715fd9/horizon/0.log" Feb 24 04:01:17 crc kubenswrapper[4923]: I0224 04:01:17.421701 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zlsm9_f8dab472-e2b2-4eab-8ced-7eed7b1bc842/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:17 crc kubenswrapper[4923]: I0224 04:01:17.591621 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-sg2sp_140f3efa-43c3-4d0b-a738-fc87e216c13b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:17 crc kubenswrapper[4923]: I0224 04:01:17.608486 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dcbd8cd94-497ns_3cad919b-bb41-4c17-a13a-01831e715fd9/horizon-log/0.log" Feb 24 04:01:17 crc kubenswrapper[4923]: I0224 04:01:17.838592 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29531761-xszfj_993f74fa-4266-4bab-a161-f6ffffe458a5/keystone-cron/0.log" Feb 24 04:01:17 crc kubenswrapper[4923]: I0224 04:01:17.910643 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-6cfd87c4f7-w99br_26c252fe-d59a-4053-946d-b75bea1a9c0b/keystone-api/0.log" Feb 24 04:01:18 crc kubenswrapper[4923]: I0224 04:01:18.020050 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_18af060f-9e29-435c-82a9-6bdd59867a46/kube-state-metrics/0.log" Feb 24 04:01:18 crc kubenswrapper[4923]: I0224 04:01:18.151269 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ftkrn_aca2992d-fbda-4dad-8ab4-02147a40ed9e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:18 crc kubenswrapper[4923]: I0224 04:01:18.443851 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-55b6b875d5-hmfv4_a13b787e-2ba9-4a5b-96d0-c1d044f4c958/neutron-httpd/0.log" Feb 24 04:01:18 crc kubenswrapper[4923]: I0224 04:01:18.460539 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-55b6b875d5-hmfv4_a13b787e-2ba9-4a5b-96d0-c1d044f4c958/neutron-api/0.log" Feb 24 04:01:18 crc kubenswrapper[4923]: I0224 04:01:18.484268 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-f587d_a55af564-f005-452b-acb3-8fa3910b1485/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:19 crc kubenswrapper[4923]: I0224 04:01:19.059524 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_633331a8-df46-4c85-b234-1e2820565794/nova-api-log/0.log" Feb 24 04:01:19 crc kubenswrapper[4923]: I0224 04:01:19.103366 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5c079676-c0fa-49dd-94fe-360388e5014d/nova-cell0-conductor-conductor/0.log" Feb 24 04:01:19 crc kubenswrapper[4923]: I0224 04:01:19.404163 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_5130427e-ca28-4060-ac80-72202959e07f/nova-cell1-conductor-conductor/0.log" Feb 24 04:01:19 crc kubenswrapper[4923]: I0224 04:01:19.487075 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7f3c9b5a-80b2-4acb-8a53-daf7f1b168a3/nova-cell1-novncproxy-novncproxy/0.log" Feb 24 04:01:19 crc kubenswrapper[4923]: I0224 04:01:19.544621 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_633331a8-df46-4c85-b234-1e2820565794/nova-api-api/0.log" Feb 24 04:01:19 crc kubenswrapper[4923]: I0224 04:01:19.629533 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-rfz2h_08f21e51-2e83-4b47-b794-1e3a2358381d/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:19 crc kubenswrapper[4923]: I0224 04:01:19.790184 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9f2c858b-ff6d-44cb-9925-d4c0ef27f133/nova-metadata-log/0.log" Feb 24 04:01:20 crc kubenswrapper[4923]: I0224 04:01:20.027040 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b2879b26-9173-4d23-b6f4-9c9e4c43f08e/mysql-bootstrap/0.log" Feb 24 04:01:20 crc kubenswrapper[4923]: I0224 04:01:20.163626 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5a8d4b38-c639-42b0-a48a-6193bee91648/nova-scheduler-scheduler/0.log" Feb 24 04:01:20 crc kubenswrapper[4923]: I0224 04:01:20.230009 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b2879b26-9173-4d23-b6f4-9c9e4c43f08e/mysql-bootstrap/0.log" Feb 24 04:01:20 crc kubenswrapper[4923]: I0224 04:01:20.316541 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b2879b26-9173-4d23-b6f4-9c9e4c43f08e/galera/0.log" Feb 24 04:01:20 crc kubenswrapper[4923]: I0224 04:01:20.464438 
4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_71ebe37b-5831-4545-8f6a-8db6e194982f/mysql-bootstrap/0.log" Feb 24 04:01:20 crc kubenswrapper[4923]: I0224 04:01:20.635338 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_71ebe37b-5831-4545-8f6a-8db6e194982f/galera/0.log" Feb 24 04:01:20 crc kubenswrapper[4923]: I0224 04:01:20.682425 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_71ebe37b-5831-4545-8f6a-8db6e194982f/mysql-bootstrap/0.log" Feb 24 04:01:20 crc kubenswrapper[4923]: I0224 04:01:20.823007 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6c4c5ccb-70c6-4e12-9bd6-d3be268e67a3/openstackclient/0.log" Feb 24 04:01:20 crc kubenswrapper[4923]: I0224 04:01:20.894279 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6l624_6bf6d02b-6b2b-4535-a1b7-6c9c3b7f5095/ovn-controller/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.102199 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pnpxx_459e20ec-ab36-4745-9a6b-8c3832560d72/openstack-network-exporter/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.125500 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9f2c858b-ff6d-44cb-9925-d4c0ef27f133/nova-metadata-metadata/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.246490 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-555wh_22ccde27-2e54-4d62-8cc4-8b12ea5e92a7/ovsdb-server-init/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.438885 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-555wh_22ccde27-2e54-4d62-8cc4-8b12ea5e92a7/ovsdb-server-init/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.487454 4923 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-555wh_22ccde27-2e54-4d62-8cc4-8b12ea5e92a7/ovs-vswitchd/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.503721 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-555wh_22ccde27-2e54-4d62-8cc4-8b12ea5e92a7/ovsdb-server/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.722098 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e0d59d8f-593d-437e-9450-93fb5bbaa025/ovn-northd/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.723058 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e0d59d8f-593d-437e-9450-93fb5bbaa025/openstack-network-exporter/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.752174 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hsl52_543b3843-407e-4043-a851-4170590b5a68/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:21 crc kubenswrapper[4923]: I0224 04:01:21.937565 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_34d16b71-0cf5-4143-9225-3e44441dc2da/openstack-network-exporter/0.log" Feb 24 04:01:22 crc kubenswrapper[4923]: I0224 04:01:22.038772 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_34d16b71-0cf5-4143-9225-3e44441dc2da/ovsdbserver-nb/0.log" Feb 24 04:01:22 crc kubenswrapper[4923]: I0224 04:01:22.598427 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c976efe6-239a-4f24-a392-b1b5ba3545de/openstack-network-exporter/0.log" Feb 24 04:01:22 crc kubenswrapper[4923]: I0224 04:01:22.633274 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c976efe6-239a-4f24-a392-b1b5ba3545de/ovsdbserver-sb/0.log" Feb 24 04:01:22 crc kubenswrapper[4923]: I0224 04:01:22.676628 4923 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-9d7999766-h8pkz_88a4bad2-fdbb-4186-b218-093ff0cf4b9c/placement-api/0.log" Feb 24 04:01:22 crc kubenswrapper[4923]: I0224 04:01:22.888762 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9d7999766-h8pkz_88a4bad2-fdbb-4186-b218-093ff0cf4b9c/placement-log/0.log" Feb 24 04:01:22 crc kubenswrapper[4923]: I0224 04:01:22.927335 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6e4608b5-cf65-4bbc-b509-85261127fe10/setup-container/0.log" Feb 24 04:01:23 crc kubenswrapper[4923]: I0224 04:01:23.062735 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6e4608b5-cf65-4bbc-b509-85261127fe10/setup-container/0.log" Feb 24 04:01:23 crc kubenswrapper[4923]: I0224 04:01:23.118449 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6e4608b5-cf65-4bbc-b509-85261127fe10/rabbitmq/0.log" Feb 24 04:01:23 crc kubenswrapper[4923]: I0224 04:01:23.155715 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4bd51e0b-15c9-4042-ac7e-c05ed0a11374/setup-container/0.log" Feb 24 04:01:23 crc kubenswrapper[4923]: I0224 04:01:23.314931 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4bd51e0b-15c9-4042-ac7e-c05ed0a11374/setup-container/0.log" Feb 24 04:01:23 crc kubenswrapper[4923]: I0224 04:01:23.378488 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4bd51e0b-15c9-4042-ac7e-c05ed0a11374/rabbitmq/0.log" Feb 24 04:01:23 crc kubenswrapper[4923]: I0224 04:01:23.382481 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-976cg_3444189b-88ae-469b-810d-e92a9a0c17d8/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:23 crc kubenswrapper[4923]: I0224 04:01:23.589904 
4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-r85nc_3816ebdb-67f1-4d77-835e-fd9323d883fd/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:23 crc kubenswrapper[4923]: I0224 04:01:23.654471 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-hngc6_d5eb03b7-77c3-4c05-a735-ce0c901c91cb/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:24 crc kubenswrapper[4923]: I0224 04:01:24.193776 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-m2jt2_b51afcd9-da3d-4f68-947a-c6af0a02cfaa/ssh-known-hosts-edpm-deployment/0.log" Feb 24 04:01:24 crc kubenswrapper[4923]: I0224 04:01:24.213623 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-f4pqm_a784fe18-eb1d-4e0d-84cb-9268b1904302/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:24 crc kubenswrapper[4923]: I0224 04:01:24.496045 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-645cdc8bdf-bkt49_28a2632f-7155-4c9e-9767-fcda3ff0688b/proxy-server/0.log" Feb 24 04:01:24 crc kubenswrapper[4923]: I0224 04:01:24.562016 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-84w4c_6113f2e8-dd3f-42d4-92f3-8fd56e4b458c/swift-ring-rebalance/0.log" Feb 24 04:01:24 crc kubenswrapper[4923]: I0224 04:01:24.564141 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-645cdc8bdf-bkt49_28a2632f-7155-4c9e-9767-fcda3ff0688b/proxy-httpd/0.log" Feb 24 04:01:24 crc kubenswrapper[4923]: I0224 04:01:24.820841 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/account-auditor/0.log" Feb 24 04:01:24 crc kubenswrapper[4923]: I0224 04:01:24.826217 4923 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/account-replicator/0.log" Feb 24 04:01:24 crc kubenswrapper[4923]: I0224 04:01:24.865077 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/account-reaper/0.log" Feb 24 04:01:24 crc kubenswrapper[4923]: I0224 04:01:24.909780 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/account-server/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.028557 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/container-server/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.042593 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/container-auditor/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.103754 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/container-replicator/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.197499 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/container-updater/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.247237 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-auditor/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.331174 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-expirer/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.347202 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-replicator/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.386850 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-server/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.432618 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/object-updater/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.537904 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/rsync/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.548977 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d2a2ae80-7d5a-47b5-ba9b-eebe33a1799f/swift-recon-cron/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.705886 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xrsk8_d8be4d6b-1c52-43ae-addf-ad44faf403f2/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.797939 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9b6f2b0b-f8d2-4a36-a1a2-177dcf809761/tempest-tests-tempest-tests-runner/0.log" Feb 24 04:01:25 crc kubenswrapper[4923]: I0224 04:01:25.940332 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ff180506-c96d-4b80-8568-e972c702ff06/test-operator-logs-container/0.log" Feb 24 04:01:26 crc kubenswrapper[4923]: I0224 04:01:26.060905 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dxjc8_890d3a1a-7dcc-4033-95c0-a3507815e8ff/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 24 04:01:28 crc kubenswrapper[4923]: I0224 04:01:28.713041 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:01:28 crc kubenswrapper[4923]: E0224 04:01:28.713637 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:01:36 crc kubenswrapper[4923]: I0224 04:01:36.949709 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2c9c3801-205a-40fd-929f-587f5aaa9ca2/memcached/0.log" Feb 24 04:01:39 crc kubenswrapper[4923]: I0224 04:01:39.713159 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:01:39 crc kubenswrapper[4923]: E0224 04:01:39.713921 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:01:51 crc kubenswrapper[4923]: I0224 04:01:51.716850 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:01:51 crc kubenswrapper[4923]: E0224 04:01:51.717645 4923 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:01:53 crc kubenswrapper[4923]: I0224 04:01:53.356152 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/util/0.log" Feb 24 04:01:53 crc kubenswrapper[4923]: I0224 04:01:53.573039 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/util/0.log" Feb 24 04:01:53 crc kubenswrapper[4923]: I0224 04:01:53.577194 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/pull/0.log" Feb 24 04:01:53 crc kubenswrapper[4923]: I0224 04:01:53.606767 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/pull/0.log" Feb 24 04:01:53 crc kubenswrapper[4923]: I0224 04:01:53.778917 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/util/0.log" Feb 24 04:01:53 crc kubenswrapper[4923]: I0224 04:01:53.808906 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/pull/0.log" Feb 24 
04:01:53 crc kubenswrapper[4923]: I0224 04:01:53.824572 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5080a3b5c86ef3a7e66f6918139775eaed60ec564c94356292f836e6f16ff7t_0fd27016-8094-45ad-9298-4f33e7692b7e/extract/0.log" Feb 24 04:01:54 crc kubenswrapper[4923]: I0224 04:01:54.242504 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-n2qf9_74dbbb69-5b9b-45b1-a74e-8bed20a6cbed/manager/0.log" Feb 24 04:01:54 crc kubenswrapper[4923]: I0224 04:01:54.580789 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-58z87_202e32ae-6025-43c3-90ee-d5a6ec2f7752/manager/0.log" Feb 24 04:01:54 crc kubenswrapper[4923]: I0224 04:01:54.702178 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-tbhxh_07169ece-c03c-464a-9899-f03b61426df5/manager/0.log" Feb 24 04:01:54 crc kubenswrapper[4923]: I0224 04:01:54.932378 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-2mdh9_2dda58b4-8524-47cf-9e31-f276859d0af1/manager/0.log" Feb 24 04:01:55 crc kubenswrapper[4923]: I0224 04:01:55.480973 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-pd2zn_1dc2eca6-9aa4-4b34-a30c-bb88a78fe1d1/manager/0.log" Feb 24 04:01:55 crc kubenswrapper[4923]: I0224 04:01:55.569835 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-bmn7v_26fc13d5-b98a-49ac-8c62-ee3ab08a9767/manager/0.log" Feb 24 04:01:55 crc kubenswrapper[4923]: I0224 04:01:55.843751 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-tqvjv_6a78564f-37a8-4385-9f93-57ee3952d36c/manager/0.log" Feb 24 04:01:56 crc kubenswrapper[4923]: I0224 04:01:56.074154 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-4d7n7_7228fc47-38cb-4680-9104-d5657a853147/manager/0.log" Feb 24 04:01:56 crc kubenswrapper[4923]: I0224 04:01:56.274789 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-4c6hc_400c9c7a-a90c-4b16-b13d-25c26be22f93/manager/0.log" Feb 24 04:01:56 crc kubenswrapper[4923]: I0224 04:01:56.315481 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-msrg6_85db5c1e-a62f-496a-a8ce-0e32d4321ac9/manager/0.log" Feb 24 04:01:56 crc kubenswrapper[4923]: I0224 04:01:56.513658 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-x6vmb_039de08e-513c-47e3-a3f7-59b8911b7dae/manager/0.log" Feb 24 04:01:56 crc kubenswrapper[4923]: I0224 04:01:56.686097 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-c8n7r_063cbd60-dc19-4c47-96ca-7b9cb24bf2ef/manager/0.log" Feb 24 04:01:56 crc kubenswrapper[4923]: I0224 04:01:56.738000 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-r64q5_31f694c1-3948-4e87-90d1-5bd1e7d0aef6/manager/0.log" Feb 24 04:01:56 crc kubenswrapper[4923]: I0224 04:01:56.966762 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cvnbkw_8c5a7840-9e6b-4442-b99e-1ce50bff0722/manager/0.log" Feb 24 04:01:57 crc kubenswrapper[4923]: I0224 04:01:57.326711 4923 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5677cd7d77-l7zkx_357fa046-8d84-43b5-8e3c-d1fe18f5d2c5/operator/0.log" Feb 24 04:01:57 crc kubenswrapper[4923]: I0224 04:01:57.499521 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zscrj_d988ae64-eb6b-4f86-a51d-5b61eb8b6d35/registry-server/0.log" Feb 24 04:01:57 crc kubenswrapper[4923]: I0224 04:01:57.677869 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-2zxl4_8b9a0e9e-0ef9-4b69-87f3-63cfb4204996/manager/0.log" Feb 24 04:01:57 crc kubenswrapper[4923]: I0224 04:01:57.755748 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-m8tjf_d368aeb3-8ad3-4ed2-8022-6dfb08b2f86e/manager/0.log" Feb 24 04:01:57 crc kubenswrapper[4923]: I0224 04:01:57.987073 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-vczgc_7e46cf81-12eb-4c37-9f04-affbd9f153b7/operator/0.log" Feb 24 04:01:58 crc kubenswrapper[4923]: I0224 04:01:58.195792 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-zxhc7_a9542edd-79bc-4eb0-8afb-7e9c61a8fb6e/manager/0.log" Feb 24 04:01:58 crc kubenswrapper[4923]: I0224 04:01:58.274103 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-f4b5c_4981f2a3-3977-40b7-819b-59cf400fa882/manager/0.log" Feb 24 04:01:58 crc kubenswrapper[4923]: I0224 04:01:58.412755 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-5pzgk_4ba7a21e-9aef-4596-9e18-21c66394cf74/manager/0.log" Feb 24 04:01:58 crc kubenswrapper[4923]: I0224 04:01:58.586852 4923 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-2bgtt_ae81debf-3361-424d-afbe-9e4521997d23/manager/0.log" Feb 24 04:01:58 crc kubenswrapper[4923]: I0224 04:01:58.760577 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5dd698895-qrccn_6b2c8692-7382-4716-8770-47d21209898f/manager/0.log" Feb 24 04:02:03 crc kubenswrapper[4923]: I0224 04:02:03.441989 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-lnz72_13de02b1-8017-4d32-b848-08d241ef34d4/manager/0.log" Feb 24 04:02:04 crc kubenswrapper[4923]: I0224 04:02:04.712991 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:02:04 crc kubenswrapper[4923]: E0224 04:02:04.713336 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:02:16 crc kubenswrapper[4923]: I0224 04:02:16.713618 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:02:16 crc kubenswrapper[4923]: E0224 04:02:16.714176 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" 
podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:02:20 crc kubenswrapper[4923]: I0224 04:02:20.236139 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rx89l_34d9b54c-a37d-407b-81c7-ff77a96b7dd8/control-plane-machine-set-operator/0.log" Feb 24 04:02:20 crc kubenswrapper[4923]: I0224 04:02:20.383223 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-82dxq_ae174d22-78c6-4699-9d9b-8ce566dc9f4c/kube-rbac-proxy/0.log" Feb 24 04:02:20 crc kubenswrapper[4923]: I0224 04:02:20.465010 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-82dxq_ae174d22-78c6-4699-9d9b-8ce566dc9f4c/machine-api-operator/0.log" Feb 24 04:02:27 crc kubenswrapper[4923]: I0224 04:02:27.721161 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:02:27 crc kubenswrapper[4923]: E0224 04:02:27.721941 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:02:34 crc kubenswrapper[4923]: I0224 04:02:34.761219 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-pbnww_779174d3-c69e-46be-b5e4-4210a6697e7b/cert-manager-controller/0.log" Feb 24 04:02:34 crc kubenswrapper[4923]: I0224 04:02:34.877613 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-6qbrj_b6864f21-fdba-416a-b777-a492c9c9e66c/cert-manager-cainjector/0.log" Feb 24 04:02:34 
crc kubenswrapper[4923]: I0224 04:02:34.944251 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-r9qzx_60be919e-8301-45ed-9e67-6e54e6ddef7f/cert-manager-webhook/0.log" Feb 24 04:02:39 crc kubenswrapper[4923]: I0224 04:02:39.713374 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:02:39 crc kubenswrapper[4923]: E0224 04:02:39.714724 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:02:47 crc kubenswrapper[4923]: I0224 04:02:47.497539 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-bxdm5_36d48cde-8247-4729-aa2d-d6b99d25b198/nmstate-console-plugin/0.log" Feb 24 04:02:47 crc kubenswrapper[4923]: I0224 04:02:47.677171 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fp5bb_886309d8-6744-4d32-a729-225ef9679579/nmstate-handler/0.log" Feb 24 04:02:47 crc kubenswrapper[4923]: I0224 04:02:47.740687 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-mnljj_ae5580e9-8a55-4dbe-99c8-e21e22d3813e/kube-rbac-proxy/0.log" Feb 24 04:02:47 crc kubenswrapper[4923]: I0224 04:02:47.797213 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-mnljj_ae5580e9-8a55-4dbe-99c8-e21e22d3813e/nmstate-metrics/0.log" Feb 24 04:02:47 crc kubenswrapper[4923]: I0224 04:02:47.921251 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-k582j_0bbb266f-3eb2-4f05-8d57-9c6ac88c83fd/nmstate-operator/0.log" Feb 24 04:02:47 crc kubenswrapper[4923]: I0224 04:02:47.995942 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-25qp5_8192c6db-fbcf-45b6-b43c-313abcc10d2e/nmstate-webhook/0.log" Feb 24 04:02:50 crc kubenswrapper[4923]: I0224 04:02:50.713237 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:02:50 crc kubenswrapper[4923]: E0224 04:02:50.714746 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:03:01 crc kubenswrapper[4923]: I0224 04:03:01.714077 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:03:01 crc kubenswrapper[4923]: E0224 04:03:01.715036 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:03:15 crc kubenswrapper[4923]: I0224 04:03:15.451912 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7bl75_ebf12b42-c896-4b13-954c-1ef5753c3fc0/kube-rbac-proxy/0.log" Feb 24 04:03:15 crc kubenswrapper[4923]: I0224 
04:03:15.590460 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7bl75_ebf12b42-c896-4b13-954c-1ef5753c3fc0/controller/0.log" Feb 24 04:03:15 crc kubenswrapper[4923]: I0224 04:03:15.664738 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-8hjgx_b9c601e3-2c93-4989-a6e9-20542436ace6/frr-k8s-webhook-server/0.log" Feb 24 04:03:15 crc kubenswrapper[4923]: I0224 04:03:15.785185 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-frr-files/0.log" Feb 24 04:03:15 crc kubenswrapper[4923]: I0224 04:03:15.891021 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-reloader/0.log" Feb 24 04:03:15 crc kubenswrapper[4923]: I0224 04:03:15.902506 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-frr-files/0.log" Feb 24 04:03:15 crc kubenswrapper[4923]: I0224 04:03:15.929238 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-metrics/0.log" Feb 24 04:03:15 crc kubenswrapper[4923]: I0224 04:03:15.954863 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-reloader/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.095725 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-reloader/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.097501 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-frr-files/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.114539 4923 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-metrics/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.144761 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-metrics/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.323260 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-frr-files/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.351936 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-metrics/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.352126 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/cp-reloader/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.356540 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/controller/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.492524 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/kube-rbac-proxy/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.509444 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/frr-metrics/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.551569 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/kube-rbac-proxy-frr/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.712627 4923 scope.go:117] "RemoveContainer" 
containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:03:16 crc kubenswrapper[4923]: E0224 04:03:16.712874 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.740349 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/reloader/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.760156 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84dbcb4757-pvzfq_475ff3bc-195d-4768-892d-1c0274b3a25c/manager/0.log" Feb 24 04:03:16 crc kubenswrapper[4923]: I0224 04:03:16.914341 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79448499bc-9ssng_0fcd559c-f2fb-455a-9cc5-6be1cd6be98a/webhook-server/0.log" Feb 24 04:03:17 crc kubenswrapper[4923]: I0224 04:03:17.134606 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k5l69_94515a6b-ba32-4b66-9cbf-42f9e0e38d14/kube-rbac-proxy/0.log" Feb 24 04:03:17 crc kubenswrapper[4923]: I0224 04:03:17.720437 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k5l69_94515a6b-ba32-4b66-9cbf-42f9e0e38d14/speaker/0.log" Feb 24 04:03:18 crc kubenswrapper[4923]: I0224 04:03:18.194714 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zj8qg_20a6c50f-c649-420b-b092-7b2015b8436e/frr/0.log" Feb 24 04:03:27 crc kubenswrapper[4923]: I0224 04:03:27.721682 4923 scope.go:117] 
"RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:03:27 crc kubenswrapper[4923]: E0224 04:03:27.723197 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:03:32 crc kubenswrapper[4923]: I0224 04:03:32.225155 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/util/0.log" Feb 24 04:03:32 crc kubenswrapper[4923]: I0224 04:03:32.968710 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/util/0.log" Feb 24 04:03:32 crc kubenswrapper[4923]: I0224 04:03:32.974460 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/pull/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.007056 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/pull/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.181941 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/extract/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.185314 4923 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/pull/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.207351 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc21356tq5_69c2b6f1-7455-4c00-a61e-43ab85b9df97/util/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.339857 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-utilities/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.508670 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-utilities/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.562580 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-content/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.589439 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-content/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.720368 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-utilities/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.771845 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/extract-content/0.log" Feb 24 04:03:33 crc kubenswrapper[4923]: I0224 04:03:33.933015 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-utilities/0.log" Feb 24 04:03:34 crc kubenswrapper[4923]: I0224 04:03:34.398220 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8gng_5d03834e-9594-476a-a7a9-1bf1aa9ade01/registry-server/0.log" Feb 24 04:03:34 crc kubenswrapper[4923]: I0224 04:03:34.593043 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-content/0.log" Feb 24 04:03:34 crc kubenswrapper[4923]: I0224 04:03:34.594500 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-utilities/0.log" Feb 24 04:03:34 crc kubenswrapper[4923]: I0224 04:03:34.618565 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-content/0.log" Feb 24 04:03:34 crc kubenswrapper[4923]: I0224 04:03:34.776399 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-utilities/0.log" Feb 24 04:03:34 crc kubenswrapper[4923]: I0224 04:03:34.869746 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/extract-content/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: I0224 04:03:35.188401 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/util/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: I0224 04:03:35.326964 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bgg4b_1ddcf8ff-1207-46ff-9dde-08219d670309/registry-server/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: I0224 04:03:35.335318 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/pull/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: I0224 04:03:35.381398 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/util/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: I0224 04:03:35.421915 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/pull/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: I0224 04:03:35.583970 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/util/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: I0224 04:03:35.596641 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/pull/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: I0224 04:03:35.611408 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca8bpnr_01359d36-26ee-45ab-83f1-2cc5a8f360be/extract/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: I0224 04:03:35.795825 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-f8kx9_05612e34-43ff-4719-9bb6-46364400281f/marketplace-operator/0.log" Feb 24 04:03:35 crc kubenswrapper[4923]: 
I0224 04:03:35.835412 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-utilities/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.021582 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-content/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.029969 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-utilities/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.059205 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-content/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.254979 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-utilities/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.258319 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-content/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.279193 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/extract-utilities/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.333416 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-s2s2b_3ade27f6-4909-4b58-b29d-d7b74686d166/registry-server/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.429169 4923 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-utilities/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.488013 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-content/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.490989 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-content/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.625128 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-utilities/0.log" Feb 24 04:03:36 crc kubenswrapper[4923]: I0224 04:03:36.673086 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/extract-content/0.log" Feb 24 04:03:37 crc kubenswrapper[4923]: I0224 04:03:37.210275 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6c8kn_3bfb8ad5-974b-4507-96cf-1150c1ca8937/registry-server/0.log" Feb 24 04:03:40 crc kubenswrapper[4923]: I0224 04:03:40.714114 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:03:40 crc kubenswrapper[4923]: E0224 04:03:40.715097 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:03:55 crc 
kubenswrapper[4923]: I0224 04:03:55.718421 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:03:55 crc kubenswrapper[4923]: E0224 04:03:55.719434 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:04:01 crc kubenswrapper[4923]: E0224 04:04:01.719857 4923 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.194:44756->38.102.83.194:35219: read tcp 38.102.83.194:44756->38.102.83.194:35219: read: connection reset by peer Feb 24 04:04:07 crc kubenswrapper[4923]: I0224 04:04:07.720132 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:04:07 crc kubenswrapper[4923]: E0224 04:04:07.721094 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:04:18 crc kubenswrapper[4923]: I0224 04:04:18.713658 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:04:18 crc kubenswrapper[4923]: E0224 04:04:18.714387 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:04:33 crc kubenswrapper[4923]: I0224 04:04:33.713286 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:04:33 crc kubenswrapper[4923]: E0224 04:04:33.715827 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.763102 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f8cdg"] Feb 24 04:04:41 crc kubenswrapper[4923]: E0224 04:04:41.764091 4923 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993f74fa-4266-4bab-a161-f6ffffe458a5" containerName="keystone-cron" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.764106 4923 state_mem.go:107] "Deleted CPUSet assignment" podUID="993f74fa-4266-4bab-a161-f6ffffe458a5" containerName="keystone-cron" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.764397 4923 memory_manager.go:354] "RemoveStaleState removing state" podUID="993f74fa-4266-4bab-a161-f6ffffe458a5" containerName="keystone-cron" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.766009 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.780413 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8cdg"] Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.830263 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962x9\" (UniqueName: \"kubernetes.io/projected/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-kube-api-access-962x9\") pod \"community-operators-f8cdg\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.830676 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-catalog-content\") pod \"community-operators-f8cdg\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.830743 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-utilities\") pod \"community-operators-f8cdg\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.932566 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-962x9\" (UniqueName: \"kubernetes.io/projected/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-kube-api-access-962x9\") pod \"community-operators-f8cdg\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.932609 4923 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-catalog-content\") pod \"community-operators-f8cdg\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.932654 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-utilities\") pod \"community-operators-f8cdg\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.933248 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-utilities\") pod \"community-operators-f8cdg\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.933251 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-catalog-content\") pod \"community-operators-f8cdg\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:41 crc kubenswrapper[4923]: I0224 04:04:41.958078 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-962x9\" (UniqueName: \"kubernetes.io/projected/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-kube-api-access-962x9\") pod \"community-operators-f8cdg\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.089411 4923 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.376392 4923 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t859p"] Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.379051 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.385718 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t859p"] Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.442612 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-catalog-content\") pod \"redhat-marketplace-t859p\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.442669 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-utilities\") pod \"redhat-marketplace-t859p\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.442768 4923 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspnm\" (UniqueName: \"kubernetes.io/projected/84f0e622-22f7-422d-bc13-858de576f94a-kube-api-access-vspnm\") pod \"redhat-marketplace-t859p\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.544240 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vspnm\" (UniqueName: \"kubernetes.io/projected/84f0e622-22f7-422d-bc13-858de576f94a-kube-api-access-vspnm\") pod \"redhat-marketplace-t859p\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.544399 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-catalog-content\") pod \"redhat-marketplace-t859p\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.544433 4923 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-utilities\") pod \"redhat-marketplace-t859p\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.544935 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-catalog-content\") pod \"redhat-marketplace-t859p\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.544966 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-utilities\") pod \"redhat-marketplace-t859p\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.567623 4923 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspnm\" (UniqueName: 
\"kubernetes.io/projected/84f0e622-22f7-422d-bc13-858de576f94a-kube-api-access-vspnm\") pod \"redhat-marketplace-t859p\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.707481 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8cdg"] Feb 24 04:04:42 crc kubenswrapper[4923]: I0224 04:04:42.717841 4923 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:43 crc kubenswrapper[4923]: I0224 04:04:43.183035 4923 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t859p"] Feb 24 04:04:43 crc kubenswrapper[4923]: W0224 04:04:43.190373 4923 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84f0e622_22f7_422d_bc13_858de576f94a.slice/crio-d594959e9d2222071c4f815eda8be10ff11901bde3c1538ca10bfc884e02db2b WatchSource:0}: Error finding container d594959e9d2222071c4f815eda8be10ff11901bde3c1538ca10bfc884e02db2b: Status 404 returned error can't find the container with id d594959e9d2222071c4f815eda8be10ff11901bde3c1538ca10bfc884e02db2b Feb 24 04:04:43 crc kubenswrapper[4923]: I0224 04:04:43.641994 4923 generic.go:334] "Generic (PLEG): container finished" podID="1463a521-5de4-4701-ad4b-b9cd1fcdd69a" containerID="396a344915a2961b794aabf047f049d96683e4b4931395d551d0321edd46304d" exitCode=0 Feb 24 04:04:43 crc kubenswrapper[4923]: I0224 04:04:43.642091 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8cdg" event={"ID":"1463a521-5de4-4701-ad4b-b9cd1fcdd69a","Type":"ContainerDied","Data":"396a344915a2961b794aabf047f049d96683e4b4931395d551d0321edd46304d"} Feb 24 04:04:43 crc kubenswrapper[4923]: I0224 04:04:43.642128 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-f8cdg" event={"ID":"1463a521-5de4-4701-ad4b-b9cd1fcdd69a","Type":"ContainerStarted","Data":"73071b9594c3b22847b10cdd774710bb38fb9ee155d52a4dc237122e6b3cf0a2"} Feb 24 04:04:43 crc kubenswrapper[4923]: I0224 04:04:43.649116 4923 generic.go:334] "Generic (PLEG): container finished" podID="84f0e622-22f7-422d-bc13-858de576f94a" containerID="716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6" exitCode=0 Feb 24 04:04:43 crc kubenswrapper[4923]: I0224 04:04:43.649162 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t859p" event={"ID":"84f0e622-22f7-422d-bc13-858de576f94a","Type":"ContainerDied","Data":"716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6"} Feb 24 04:04:43 crc kubenswrapper[4923]: I0224 04:04:43.649191 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t859p" event={"ID":"84f0e622-22f7-422d-bc13-858de576f94a","Type":"ContainerStarted","Data":"d594959e9d2222071c4f815eda8be10ff11901bde3c1538ca10bfc884e02db2b"} Feb 24 04:04:44 crc kubenswrapper[4923]: I0224 04:04:44.662895 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8cdg" event={"ID":"1463a521-5de4-4701-ad4b-b9cd1fcdd69a","Type":"ContainerStarted","Data":"c16a4fe1fc463db6d6c7c6494cc459351b49e336d6198c58d96f58a239b341a8"} Feb 24 04:04:45 crc kubenswrapper[4923]: I0224 04:04:45.679237 4923 generic.go:334] "Generic (PLEG): container finished" podID="1463a521-5de4-4701-ad4b-b9cd1fcdd69a" containerID="c16a4fe1fc463db6d6c7c6494cc459351b49e336d6198c58d96f58a239b341a8" exitCode=0 Feb 24 04:04:45 crc kubenswrapper[4923]: I0224 04:04:45.679284 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8cdg" 
event={"ID":"1463a521-5de4-4701-ad4b-b9cd1fcdd69a","Type":"ContainerDied","Data":"c16a4fe1fc463db6d6c7c6494cc459351b49e336d6198c58d96f58a239b341a8"} Feb 24 04:04:45 crc kubenswrapper[4923]: I0224 04:04:45.685328 4923 generic.go:334] "Generic (PLEG): container finished" podID="84f0e622-22f7-422d-bc13-858de576f94a" containerID="bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b" exitCode=0 Feb 24 04:04:45 crc kubenswrapper[4923]: I0224 04:04:45.685397 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t859p" event={"ID":"84f0e622-22f7-422d-bc13-858de576f94a","Type":"ContainerDied","Data":"bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b"} Feb 24 04:04:45 crc kubenswrapper[4923]: I0224 04:04:45.713059 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:04:45 crc kubenswrapper[4923]: E0224 04:04:45.713481 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:04:46 crc kubenswrapper[4923]: I0224 04:04:46.706368 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8cdg" event={"ID":"1463a521-5de4-4701-ad4b-b9cd1fcdd69a","Type":"ContainerStarted","Data":"de907626d389063ece86895850dd3529650056a3183efae3e3e49d644c2ac795"} Feb 24 04:04:46 crc kubenswrapper[4923]: I0224 04:04:46.709754 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t859p" 
event={"ID":"84f0e622-22f7-422d-bc13-858de576f94a","Type":"ContainerStarted","Data":"e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be"} Feb 24 04:04:46 crc kubenswrapper[4923]: I0224 04:04:46.726501 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f8cdg" podStartSLOduration=3.184750642 podStartE2EDuration="5.726482031s" podCreationTimestamp="2026-02-24 04:04:41 +0000 UTC" firstStartedPulling="2026-02-24 04:04:43.64399917 +0000 UTC m=+4207.661070003" lastFinishedPulling="2026-02-24 04:04:46.185730579 +0000 UTC m=+4210.202801392" observedRunningTime="2026-02-24 04:04:46.722899988 +0000 UTC m=+4210.739970791" watchObservedRunningTime="2026-02-24 04:04:46.726482031 +0000 UTC m=+4210.743552844" Feb 24 04:04:46 crc kubenswrapper[4923]: I0224 04:04:46.754474 4923 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t859p" podStartSLOduration=2.249380333 podStartE2EDuration="4.754450958s" podCreationTimestamp="2026-02-24 04:04:42 +0000 UTC" firstStartedPulling="2026-02-24 04:04:43.652270205 +0000 UTC m=+4207.669341038" lastFinishedPulling="2026-02-24 04:04:46.15734084 +0000 UTC m=+4210.174411663" observedRunningTime="2026-02-24 04:04:46.742778315 +0000 UTC m=+4210.759849128" watchObservedRunningTime="2026-02-24 04:04:46.754450958 +0000 UTC m=+4210.771521781" Feb 24 04:04:52 crc kubenswrapper[4923]: I0224 04:04:52.089745 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:52 crc kubenswrapper[4923]: I0224 04:04:52.090562 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:52 crc kubenswrapper[4923]: I0224 04:04:52.162974 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f8cdg" Feb 24 
04:04:52 crc kubenswrapper[4923]: I0224 04:04:52.718274 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:52 crc kubenswrapper[4923]: I0224 04:04:52.718942 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:52 crc kubenswrapper[4923]: I0224 04:04:52.804495 4923 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:52 crc kubenswrapper[4923]: I0224 04:04:52.874063 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:53 crc kubenswrapper[4923]: I0224 04:04:53.849165 4923 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:55 crc kubenswrapper[4923]: I0224 04:04:55.350146 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8cdg"] Feb 24 04:04:55 crc kubenswrapper[4923]: I0224 04:04:55.350750 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f8cdg" podUID="1463a521-5de4-4701-ad4b-b9cd1fcdd69a" containerName="registry-server" containerID="cri-o://de907626d389063ece86895850dd3529650056a3183efae3e3e49d644c2ac795" gracePeriod=2 Feb 24 04:04:55 crc kubenswrapper[4923]: I0224 04:04:55.800702 4923 generic.go:334] "Generic (PLEG): container finished" podID="1463a521-5de4-4701-ad4b-b9cd1fcdd69a" containerID="de907626d389063ece86895850dd3529650056a3183efae3e3e49d644c2ac795" exitCode=0 Feb 24 04:04:55 crc kubenswrapper[4923]: I0224 04:04:55.801077 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8cdg" 
event={"ID":"1463a521-5de4-4701-ad4b-b9cd1fcdd69a","Type":"ContainerDied","Data":"de907626d389063ece86895850dd3529650056a3183efae3e3e49d644c2ac795"} Feb 24 04:04:55 crc kubenswrapper[4923]: I0224 04:04:55.899653 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:55 crc kubenswrapper[4923]: I0224 04:04:55.949771 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t859p"] Feb 24 04:04:55 crc kubenswrapper[4923]: I0224 04:04:55.950177 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t859p" podUID="84f0e622-22f7-422d-bc13-858de576f94a" containerName="registry-server" containerID="cri-o://e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be" gracePeriod=2 Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.025181 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-utilities\") pod \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.025285 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-962x9\" (UniqueName: \"kubernetes.io/projected/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-kube-api-access-962x9\") pod \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.025426 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-catalog-content\") pod \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\" (UID: \"1463a521-5de4-4701-ad4b-b9cd1fcdd69a\") " Feb 24 04:04:56 crc kubenswrapper[4923]: 
I0224 04:04:56.026151 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-utilities" (OuterVolumeSpecName: "utilities") pod "1463a521-5de4-4701-ad4b-b9cd1fcdd69a" (UID: "1463a521-5de4-4701-ad4b-b9cd1fcdd69a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.039093 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-kube-api-access-962x9" (OuterVolumeSpecName: "kube-api-access-962x9") pod "1463a521-5de4-4701-ad4b-b9cd1fcdd69a" (UID: "1463a521-5de4-4701-ad4b-b9cd1fcdd69a"). InnerVolumeSpecName "kube-api-access-962x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.086322 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1463a521-5de4-4701-ad4b-b9cd1fcdd69a" (UID: "1463a521-5de4-4701-ad4b-b9cd1fcdd69a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.127933 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.127990 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-962x9\" (UniqueName: \"kubernetes.io/projected/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-kube-api-access-962x9\") on node \"crc\" DevicePath \"\"" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.128009 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1463a521-5de4-4701-ad4b-b9cd1fcdd69a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.352642 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.435927 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vspnm\" (UniqueName: \"kubernetes.io/projected/84f0e622-22f7-422d-bc13-858de576f94a-kube-api-access-vspnm\") pod \"84f0e622-22f7-422d-bc13-858de576f94a\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.436113 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-catalog-content\") pod \"84f0e622-22f7-422d-bc13-858de576f94a\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.436216 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-utilities\") pod \"84f0e622-22f7-422d-bc13-858de576f94a\" (UID: \"84f0e622-22f7-422d-bc13-858de576f94a\") " Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.437417 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-utilities" (OuterVolumeSpecName: "utilities") pod "84f0e622-22f7-422d-bc13-858de576f94a" (UID: "84f0e622-22f7-422d-bc13-858de576f94a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.439717 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f0e622-22f7-422d-bc13-858de576f94a-kube-api-access-vspnm" (OuterVolumeSpecName: "kube-api-access-vspnm") pod "84f0e622-22f7-422d-bc13-858de576f94a" (UID: "84f0e622-22f7-422d-bc13-858de576f94a"). InnerVolumeSpecName "kube-api-access-vspnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.459968 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84f0e622-22f7-422d-bc13-858de576f94a" (UID: "84f0e622-22f7-422d-bc13-858de576f94a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.540612 4923 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.540793 4923 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84f0e622-22f7-422d-bc13-858de576f94a-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.540858 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vspnm\" (UniqueName: \"kubernetes.io/projected/84f0e622-22f7-422d-bc13-858de576f94a-kube-api-access-vspnm\") on node \"crc\" DevicePath \"\"" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.815413 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8cdg" event={"ID":"1463a521-5de4-4701-ad4b-b9cd1fcdd69a","Type":"ContainerDied","Data":"73071b9594c3b22847b10cdd774710bb38fb9ee155d52a4dc237122e6b3cf0a2"} Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.815484 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8cdg" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.815947 4923 scope.go:117] "RemoveContainer" containerID="de907626d389063ece86895850dd3529650056a3183efae3e3e49d644c2ac795" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.820330 4923 generic.go:334] "Generic (PLEG): container finished" podID="84f0e622-22f7-422d-bc13-858de576f94a" containerID="e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be" exitCode=0 Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.820383 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t859p" event={"ID":"84f0e622-22f7-422d-bc13-858de576f94a","Type":"ContainerDied","Data":"e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be"} Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.820416 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t859p" event={"ID":"84f0e622-22f7-422d-bc13-858de576f94a","Type":"ContainerDied","Data":"d594959e9d2222071c4f815eda8be10ff11901bde3c1538ca10bfc884e02db2b"} Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.820457 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t859p" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.876866 4923 scope.go:117] "RemoveContainer" containerID="c16a4fe1fc463db6d6c7c6494cc459351b49e336d6198c58d96f58a239b341a8" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.893543 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8cdg"] Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.908000 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f8cdg"] Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.915999 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t859p"] Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.917541 4923 scope.go:117] "RemoveContainer" containerID="396a344915a2961b794aabf047f049d96683e4b4931395d551d0321edd46304d" Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.925666 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t859p"] Feb 24 04:04:56 crc kubenswrapper[4923]: I0224 04:04:56.961711 4923 scope.go:117] "RemoveContainer" containerID="e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.002792 4923 scope.go:117] "RemoveContainer" containerID="bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.024433 4923 scope.go:117] "RemoveContainer" containerID="716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.073418 4923 scope.go:117] "RemoveContainer" containerID="e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be" Feb 24 04:04:57 crc kubenswrapper[4923]: E0224 04:04:57.073996 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be\": container with ID starting with e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be not found: ID does not exist" containerID="e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.074133 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be"} err="failed to get container status \"e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be\": rpc error: code = NotFound desc = could not find container \"e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be\": container with ID starting with e05cd5a0aaa6b9c567ec7acdae1828c056280b3c6a43869f38e3113df3aea1be not found: ID does not exist" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.074231 4923 scope.go:117] "RemoveContainer" containerID="bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b" Feb 24 04:04:57 crc kubenswrapper[4923]: E0224 04:04:57.074611 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b\": container with ID starting with bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b not found: ID does not exist" containerID="bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.074644 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b"} err="failed to get container status \"bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b\": rpc error: code = NotFound desc = could not find container 
\"bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b\": container with ID starting with bccb240ecd869caccd181df5f4ccf9164665e9a6ac721cd23286250ca3b16b4b not found: ID does not exist" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.074669 4923 scope.go:117] "RemoveContainer" containerID="716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6" Feb 24 04:04:57 crc kubenswrapper[4923]: E0224 04:04:57.078778 4923 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6\": container with ID starting with 716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6 not found: ID does not exist" containerID="716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.078832 4923 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6"} err="failed to get container status \"716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6\": rpc error: code = NotFound desc = could not find container \"716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6\": container with ID starting with 716d33114d8a8a964f9726a5f4f65be4af8ae847ed0a3b80469f3908e8fa98e6 not found: ID does not exist" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.731409 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1463a521-5de4-4701-ad4b-b9cd1fcdd69a" path="/var/lib/kubelet/pods/1463a521-5de4-4701-ad4b-b9cd1fcdd69a/volumes" Feb 24 04:04:57 crc kubenswrapper[4923]: I0224 04:04:57.732220 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f0e622-22f7-422d-bc13-858de576f94a" path="/var/lib/kubelet/pods/84f0e622-22f7-422d-bc13-858de576f94a/volumes" Feb 24 04:05:00 crc kubenswrapper[4923]: I0224 04:05:00.718618 
4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:05:00 crc kubenswrapper[4923]: E0224 04:05:00.719829 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:05:13 crc kubenswrapper[4923]: I0224 04:05:13.714053 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:05:13 crc kubenswrapper[4923]: E0224 04:05:13.715066 4923 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rh26t_openshift-machine-config-operator(f2467bf1-1ba4-491e-b677-79c589f353ec)\"" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" Feb 24 04:05:24 crc kubenswrapper[4923]: I0224 04:05:24.136152 4923 generic.go:334] "Generic (PLEG): container finished" podID="a6a2dd88-cbaa-4497-9978-ddec403316a2" containerID="22098e4b20089e59beaa08bf74fc42d06b2fbc0b25a3e98da5b26288be3997ba" exitCode=0 Feb 24 04:05:24 crc kubenswrapper[4923]: I0224 04:05:24.136236 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-przsk/must-gather-w6qlt" event={"ID":"a6a2dd88-cbaa-4497-9978-ddec403316a2","Type":"ContainerDied","Data":"22098e4b20089e59beaa08bf74fc42d06b2fbc0b25a3e98da5b26288be3997ba"} Feb 24 04:05:24 crc kubenswrapper[4923]: I0224 04:05:24.137771 4923 scope.go:117] "RemoveContainer" 
containerID="22098e4b20089e59beaa08bf74fc42d06b2fbc0b25a3e98da5b26288be3997ba" Feb 24 04:05:25 crc kubenswrapper[4923]: I0224 04:05:25.170479 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-przsk_must-gather-w6qlt_a6a2dd88-cbaa-4497-9978-ddec403316a2/gather/0.log" Feb 24 04:05:27 crc kubenswrapper[4923]: I0224 04:05:27.720935 4923 scope.go:117] "RemoveContainer" containerID="4bfec54ebaaa163629b68ead386c53c563d2356674d8e97e6b0d86cefa2f68d3" Feb 24 04:05:29 crc kubenswrapper[4923]: I0224 04:05:29.187808 4923 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" event={"ID":"f2467bf1-1ba4-491e-b677-79c589f353ec","Type":"ContainerStarted","Data":"15981a3b4b7eb7479073d0d71c9e48270b49497214dc69b55a1da2a14c45c1d8"} Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.009388 4923 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-przsk/must-gather-w6qlt"] Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.011066 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-przsk/must-gather-w6qlt" podUID="a6a2dd88-cbaa-4497-9978-ddec403316a2" containerName="copy" containerID="cri-o://d0489b28e88b3e56edb268637bfe60f72375dbf5e698c6f1a84f375de0311bf7" gracePeriod=2 Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.022119 4923 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-przsk/must-gather-w6qlt"] Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.269965 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-przsk_must-gather-w6qlt_a6a2dd88-cbaa-4497-9978-ddec403316a2/copy/0.log" Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.270426 4923 generic.go:334] "Generic (PLEG): container finished" podID="a6a2dd88-cbaa-4497-9978-ddec403316a2" containerID="d0489b28e88b3e56edb268637bfe60f72375dbf5e698c6f1a84f375de0311bf7" exitCode=143 
Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.798285 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-przsk_must-gather-w6qlt_a6a2dd88-cbaa-4497-9978-ddec403316a2/copy/0.log" Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.799069 4923 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.812926 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fb9m\" (UniqueName: \"kubernetes.io/projected/a6a2dd88-cbaa-4497-9978-ddec403316a2-kube-api-access-6fb9m\") pod \"a6a2dd88-cbaa-4497-9978-ddec403316a2\" (UID: \"a6a2dd88-cbaa-4497-9978-ddec403316a2\") " Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.813155 4923 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6a2dd88-cbaa-4497-9978-ddec403316a2-must-gather-output\") pod \"a6a2dd88-cbaa-4497-9978-ddec403316a2\" (UID: \"a6a2dd88-cbaa-4497-9978-ddec403316a2\") " Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.819841 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a2dd88-cbaa-4497-9978-ddec403316a2-kube-api-access-6fb9m" (OuterVolumeSpecName: "kube-api-access-6fb9m") pod "a6a2dd88-cbaa-4497-9978-ddec403316a2" (UID: "a6a2dd88-cbaa-4497-9978-ddec403316a2"). InnerVolumeSpecName "kube-api-access-6fb9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.915385 4923 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fb9m\" (UniqueName: \"kubernetes.io/projected/a6a2dd88-cbaa-4497-9978-ddec403316a2-kube-api-access-6fb9m\") on node \"crc\" DevicePath \"\"" Feb 24 04:05:36 crc kubenswrapper[4923]: I0224 04:05:36.952457 4923 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a2dd88-cbaa-4497-9978-ddec403316a2-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a6a2dd88-cbaa-4497-9978-ddec403316a2" (UID: "a6a2dd88-cbaa-4497-9978-ddec403316a2"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 04:05:37 crc kubenswrapper[4923]: I0224 04:05:37.016851 4923 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6a2dd88-cbaa-4497-9978-ddec403316a2-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 24 04:05:37 crc kubenswrapper[4923]: I0224 04:05:37.279289 4923 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-przsk_must-gather-w6qlt_a6a2dd88-cbaa-4497-9978-ddec403316a2/copy/0.log" Feb 24 04:05:37 crc kubenswrapper[4923]: I0224 04:05:37.279754 4923 scope.go:117] "RemoveContainer" containerID="d0489b28e88b3e56edb268637bfe60f72375dbf5e698c6f1a84f375de0311bf7" Feb 24 04:05:37 crc kubenswrapper[4923]: I0224 04:05:37.279817 4923 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-przsk/must-gather-w6qlt" Feb 24 04:05:37 crc kubenswrapper[4923]: I0224 04:05:37.297514 4923 scope.go:117] "RemoveContainer" containerID="22098e4b20089e59beaa08bf74fc42d06b2fbc0b25a3e98da5b26288be3997ba" Feb 24 04:05:37 crc kubenswrapper[4923]: I0224 04:05:37.726016 4923 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a2dd88-cbaa-4497-9978-ddec403316a2" path="/var/lib/kubelet/pods/a6a2dd88-cbaa-4497-9978-ddec403316a2/volumes" Feb 24 04:06:59 crc kubenswrapper[4923]: I0224 04:06:59.502145 4923 scope.go:117] "RemoveContainer" containerID="cdfad27bec8058c5e8549f7d99bb55ef1447577b00c265c12364146d679ead3f" Feb 24 04:06:59 crc kubenswrapper[4923]: I0224 04:06:59.540536 4923 scope.go:117] "RemoveContainer" containerID="55371fa78305f9a6466e15d46a9b35dc6a5986eb2eeb46ad4a9294c2f2a67914" Feb 24 04:07:49 crc kubenswrapper[4923]: I0224 04:07:49.917022 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 04:07:49 crc kubenswrapper[4923]: I0224 04:07:49.918548 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 04:08:19 crc kubenswrapper[4923]: I0224 04:08:19.916239 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 04:08:19 crc 
kubenswrapper[4923]: I0224 04:08:19.916846 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 04:08:49 crc kubenswrapper[4923]: I0224 04:08:49.916281 4923 patch_prober.go:28] interesting pod/machine-config-daemon-rh26t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 04:08:49 crc kubenswrapper[4923]: I0224 04:08:49.916887 4923 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 04:08:49 crc kubenswrapper[4923]: I0224 04:08:49.916929 4923 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" Feb 24 04:08:49 crc kubenswrapper[4923]: I0224 04:08:49.917707 4923 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"15981a3b4b7eb7479073d0d71c9e48270b49497214dc69b55a1da2a14c45c1d8"} pod="openshift-machine-config-operator/machine-config-daemon-rh26t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 04:08:49 crc kubenswrapper[4923]: I0224 04:08:49.917779 4923 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rh26t" 
podUID="f2467bf1-1ba4-491e-b677-79c589f353ec" containerName="machine-config-daemon" containerID="cri-o://15981a3b4b7eb7479073d0d71c9e48270b49497214dc69b55a1da2a14c45c1d8" gracePeriod=600